Ignorance in Climate Science

Jerome Ravetz wrote Scientific Knowledge and its Social Problems (1971, 1996) and (with Silvio Funtowicz) Uncertainty and Quality in Science for Policy  (1990).  They created the NUSAP notational system and the theory of Post-Normal Science.  He is currently associated with the Institute for Science, Innovation and Society at Oxford University.

Our modern scientific view of knowledge was defined by a throwaway line in Descartes’ Discourse on Method.  Referring to his dissatisfaction with his education at school, he claimed,

“I was convinced I had advanced no further in all my attempts at learning, than the discovery at every turn of my own ignorance”. 

He was careful to say that his school was not to blame, although a little later he did a brilliant assassination job on the whole humanistic curriculum.  Readers now might not notice the irony in Descartes’ complaint.  It was not merely another case of late-adolescent angst.  For in the mention of the discovery of ignorance, his educated readers would have recognised an echo of Socrates.  This founder of philosophy was remembered as saying that his whole life’s work was the discovery of his ignorance.  By the criteria of Socrates and all who followed, the education of the young Descartes had been a great success:  so early in life he had succeeded in discovering his ignorance!  With both Descartes and his readers knowing this background, they would recognise his complaint as the casual discarding of a couple of millennia of moral philosophy.  “Know thyself” was out, “Discover truth” was in.

This point is not of merely scholarly historical interest.  The Scientific Revolution produced a variety of accounts of scientific knowledge, differing in their balance of reason and experience, and also in the strength of their claims to certainty.  But they all agreed in their tacit elimination of ignorance from their pictures of the acquisition of knowledge.  Of course, publicists for science recognise ignorance, but mainly as something out there to be conquered by the advance of science.  When scientists have undergone a lengthy and rigorous training in which they learn that for every real problem there is always one (and only one) correct answer, there is little danger of them sharing Descartes’ school-leaver’s predicament.

The relevance of this issue today is, to what extent should we incorporate ignorance, as distinct from tameable uncertainty, into our reasonings about science and science policy?  I would argue that the suppression of ignorance in our debates, perhaps even its repression in our thinking, seriously impedes our management of our scientific affairs.

There is evidence that, particularly in climate science, ignorance is something of a taboo idea, even when it might seem to be most relevant.  I have two illustrative examples from the climate science area.  The first relates to a proposed scale of uncertainty, designed by James Risbey and Milind Kandlikar [1], and adopted by the IPCC [2].  This has the merit of providing a single robust scale of degrees of uncertainty, based on the notations for expressing it in numerical form.  It could be of great use in resolving the confusing variety of schemes that are employed in the various special fields that contribute to climate science.  The scale includes five degrees of increasing uncertainty, concluding with a sixth category for ignorance.  The authors were pleased to see the scale adopted by the IPCC, but then surprised to see that the category for ignorance had been deleted in the IPCC version [3].

Another example provides even stronger evidence of a consistent attitude.  Two authors who are eminent in their own fields, Sir Nicholas Stern and Leonard Smith, recently published a paper on the characterisation of uncertainty in climate science [4].  The paper is truly magisterial, bringing deep analytical clarity to this very confused subject.  But, again surprisingly, a search for ‘ignorance’ in the text produces only three occurrences, and two of those are incidental (p. 16 twice).  The only substantive reference relates ‘ignorance’, or rather ‘recognised ignorance’, back to ‘ambiguity’ or ‘Knightian uncertainty’ (p. 4).  It would seem that ignorance, in its own right as a qualitatively deeper sort of uncertainty, is not relevant here.  The absence must be deliberate, for the whole essay can be read as a detailed warning of the many pitfalls of mismanagement of uncertainty, along with the ‘fallacy of misplaced concreteness’ in relation to models. Indeed, it can be read as a Socratic exercise in all but name and vocabulary.

Particularly for that reason, I confess that I cannot agree with the absence of ignorance.  Suppose that a senior planner, responsible for the long-range defences of the Thames Estuary, approaches experts for an estimate of the sea-level rise to the end of the century.  It would be technically correct to say, “It will probably be somewhere between one and four metres, but where in between is a matter of ambiguity”.  The planner might prefer to be told simply, “I don’t know,” with a review of the reasons for speculating on the likelihood of one range of values over another.

It is not as if ignorance were totally banned from policy-relevant science.  In medicine, for example, we know that we don’t know the causes of some important diseases, as indeed we are aware of our ignorance of the course of future epidemics.  The sciences do not lose public prestige because of their frankness about their deep limitations in relation to some urgent issues.  Rather, they gain trust because of their honesty with their publics.

We can see the explicit recognition of ignorance as part of the programme of a ‘technology of humility’ proposed by  Sheila Jasanoff of Harvard University [5].  It would fit particularly well with climate science, since this is after all a part of a great humanitarian project rather than a quest for profit, power or privilege.  The message of Socrates, rejected with such ultimately devastating effect by Descartes, could inform such a science and provide it with an enriching humane element.

References

[1] Risbey, J. and M. Kandlikar, 2007: Expressions of likelihood and confidence in the IPCC uncertainty assessment process. Climatic Change, 85(1-2), 19-31.

[2] Mastrandrea, M., C. Field, T. Stocker, O. Edenhofer, K. Ebi, D. Frame, H. Held, E. Kriegler, K. Mach, G. Plattner, G. Yohe and F. Zwiers, 2010: Guidance notes for lead authors of the IPCC Fifth Assessment Report on consistent treatment of uncertainties. Available at https://www.ipcc.ch

[3] Risbey, J. and T. O’Kane, 2011: Sources of knowledge and ignorance in climate research. Climatic Change, 108(4), 755-773.

[4] Smith, L. and N. Stern, 2011: Uncertainty in science and its role in science policy. Phil. Trans. R. Soc. A, 369, 1-24.

[5] Jasanoff, S., 2003: Technologies of Humility: Citizen Participation in Governing Science. Minerva, 41, 223-244.

The War on Cancer…Phobia

David Ropeik is an international consultant in risk perception and risk communication, and an Instructor in the Environmental Management Program at the Harvard University Extension School. He is the author of How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts and principal co-author of RISK: A Practical Guide for Deciding What’s Really Safe and What’s Really Dangerous in the World Around You. He writes the blog Risk: Reason and Reality at BigThink.com and also writes for Huffington Post, Psychology Today, and Scientific American.

He founded the program “Improving Media Coverage of Risk,” was an award-winning journalist in Boston for 22 years and a Knight Science Journalism Fellow at MIT.

If you were to be diagnosed with cancer, how do you think you would feel? It would depend on the type of cancer of course, but there’s a good chance that no matter the details, the word ‘cancer’ would make the diagnosis much more frightening. Frightening enough, in fact, to do you as much harm as, or more than, the disease itself.  There is no question that in many cases, we are Cancer Phobic, more afraid of the disease than the medical evidence says we need to be, and that fear alone can be bad for our health. As much as we need to understand cancer itself, we need to recognize and understand this risk, the risk of Cancer Phobia, in order to avoid all of what this awful disease can do to us.

In a recent report to the U.S. National Institutes of Health (NIH), a panel of leading experts on prostate cancer, the second most common cancer in men (after skin), said:

“Although most prostate cancers are slow growing and unlikely to spread, most men receive immediate treatment with surgery or radiation. These therapeutic strategies are associated with short- and long-term complications including impotence and urinary incontinence.”

“Approximately 10 percent of men who are eligible for observational strategies (keep an eye on it but no immediate need for surgery or radiation) choose this approach.”

“Early results demonstrate disease-free and survival rates that compare favorably (between observation and) curative therapy.”

“Because of the very favorable prognosis of low-risk prostate cancer, strong consideration should be given to removing the anxiety-provoking term ‘cancer’ for this condition.”

Let me sum that up. Many prostate cancers grow so slowly they don’t need to be treated right away…the unnecessary treatment causes significant harm…and one of the reasons nine out of ten men diagnosed with slow-growing prostate cancer accept, indeed choose, these unnecessary harms is because “cancer” sounds scary.

Consider more evidence for Cancer Phobia. In “Overdiagnosis in Cancer”, doctors at Dartmouth classified “25% of mammographically detected breast cancers, 50% of chest x-ray and/or sputum-detected lung cancers, and 60% of prostate-specific antigen–detected prostate cancers” as ‘overdiagnosed’, which they defined as “1. The cancer never progresses (or, in fact, regresses) or 2. The cancer progresses slowly enough that the patient dies of other causes before the cancer becomes symptomatic.” The doctors described the negative health effects such patients suffer from a range of treatments that often involve radical surgery and noted: “Although such patients cannot benefit from unnecessary treatment, they can be harmed.”

Add to those harms the damage from stress caused by the diagnosis of cancer, or even the fear of getting it. Chronic stress raises blood pressure and contributes to heart disease. Even more directly as regards cancer, chronic stress weakens the immune system, the very system our bodies need to help prevent, fight, or recover from, the disease itself. And beyond these harms to individual patients, consider the cost of Cancer Phobia at the societal level.

The basic biological mechanics of what causes both cancer and heart disease are still inadequately understood and need fundamental research. But the U.S. National Institutes of Health spend about four times as much on cancer research as on heart disease research, despite the fact that heart disease kills about 10% more people than cancer (roughly 60,000 more deaths each year, about 165 more per day). We are spending far more to understand the second leading cause of death than the first, which is more likely to kill us.

Despite all the progress we’ve made on cancer, a recent Harris poll found that cancer is the most feared disease in the U.S., 41% to Alzheimer’s 31%. (Only 8% of Americans are most afraid of the leading cause of death in the U.S., heart disease.) In August 2011, Cancer Research UK found 35% of Britons feared cancer most, followed by Alzheimer’s at 25%. And this fear is hardly new. Forty years ago the U.S. National Cancer Act of 1971, which declared “War on Cancer,” said “…cancer is the disease which is the major health concern of Americans today.”

Cancer Phobia goes even further back. The term itself was coined by Dr. George Crile, Jr., in a 1955 Life Magazine article, “Fear of Cancer and unnecessary operations”. His insights describe conditions today as accurately as they did then: “Those responsible for telling the public about cancer have chosen the weapon of fear, believing that only through fear can the public be educated. Newspapers and magazines have magnified and spread this fear, knowing that the public is always interested in the melodramatic and the frightening. This has fostered a disease, fear of cancer, a contagious disease that spreads from mouth to ear. It is possible that today, in terms of the total number of people affected, fear of cancer is causing more suffering than cancer itself. This fear leads both doctors and patients to do unreasonable and therefore dangerous things.”

Unfortunately, Dr. Crile Jr. overlooked the key truth about our fear of cancer: Cancer Phobia is hardly just the product of zealous health and environmental advocates magnified by media alarmism. It comes from the innate way we perceive all risks, a process that relies not only on the statistical and medical facts, but on how those facts feel. Risk perception is a blend of conscious reasoning and subconscious instinct, and neuroscience suggests that between the two, instincts and emotions have the upper hand. While we’ve been busy studying cancer, we have also learned a lot about the specific psychological characteristics of cancer that make it particularly frightening.

The more pain and suffering a risk involves, like cancer, the scarier it is.

The less control over a risk we feel we have, the scarier it is. Despite great medical progress, cancer is still something that too often can’t be controlled. It is still widely assumed that a diagnosis of cancer is a death sentence.

The more a risk feels imposed on us, rather than the result of something we did by choice, the scarier it is. Many people continue to believe that a majority of cancers are ‘done to us’ by outside forces, despite the medical evidence that environmental cancers (beyond those caused by our lifestyle choices of diet and exercise) make up perhaps 10-15% of all cases.

The greater our ‘mental availability’ about a risk – how readily the risk comes to mind – the scarier it is. Cancer is constantly in the news. And the very mention of the word ‘cancer’ is instantly overwhelmingly negative, a psychological effect called Stigmatization that makes it difficult for us to think about things objectively.

“Cancer” is no longer the automatic death sentence it was once feared to be. From 1990 to 2010 the overall death rate from cancer in the U.S. has dropped 22% in men and 14% in women.  (Incidence in the U.S. has stayed about the same.) In the U.K., the male mortality rate has dropped 26% and the female rate has declined 16% since 1980 (even while the incidence rate in the UK has increased 22%).

We have learned an immense amount about cancer, allowing us to treat, or even prevent, some types that used to be fatal. But we have also learned a great deal about the psychology of risk perception and why our fears often don’t match the evidence. We are failing to use that knowledge to protect ourselves from the significant, tangible health risks of our innately subjective risk perception system. The proposal of the NIH panel to replace the “C” word with something else that is medically honest but emotionally less frightening is a tiny first step in the right direction, opening a new front in the War on Cancer: the battle against Cancer Phobia.

The rise of anomalistic psychology – and the fall of parapsychology?

Professor Chris French is the Head of the  Anomalistic Psychology Research Unit  in the Psychology Department at Goldsmiths, University of London.  He is also a Fellow of the British Psychological Society and of the Committee for Skeptical Inquiry and a member of the Scientific Advisory Board of the British False Memory Society.  His main current area of research is the psychology of paranormal beliefs and anomalous experiences. He frequently appears in the media casting a skeptical eye over paranormal claims. He edited The Skeptic magazine for more than a decade and sometimes writes for the Guardian’s online science pages. 

Ever since records began, people have reported strange experiences that appear to contradict our conventional scientific understanding of the universe. These have included reports that appear to support the possibility of life after death, such as near-death experiences, ghostly encounters and apparent communication with the dead, as well as claims by various individuals that they possessed mysterious powers such as the ability to read minds, see into the future, obtain information from remote locations without the use of the known sensory channels, or to move objects by willpower alone.  Such accounts are accepted as veridical by most of the world’s population in one form or another and claims relating to miraculous healing, alien abduction, astrological prediction and the power of crystals are also accepted by many.  Belief in such paranormal claims is clearly an important aspect of the human condition. What are we to make of such accounts from a scientific perspective?

Should we accept at least some of these claims more or less at face value? That is to say, should we accept that extrasensory perception (ESP), psychokinesis (PK), and life after death are all real? Parapsychologists have systematically investigated such phenomena for around 130 years but have so far failed to convince the wider scientific community that this is the case. The eminent scientists and intellectuals who founded the Society for Psychical Research in 1882 were convinced that, with the tools of science at their disposal, they would settle the issue one way or another within a few years. Clearly, that has not happened. Instead, parapsychology has been characterised by a series of ‘false dawns’ during which it has been declared that at last a technique has been developed which can reliably show under well-controlled conditions that paranormal effects are real. With time, however, the technique falls out of favour as subsequent research fails to replicate the initially reported effects and methodological shortcomings become apparent.

The latest candidate for such a ‘false dawn’ is a series of relatively straightforward experiments reported by Daryl Bem in the prestigious Journal of Personality and Social Psychology.  In eight of nine experiments, involving more than a thousand participants in total, Bem reported significant results suggesting that human beings are able in some way to sense events before they happen. For example, the study which produced the largest effect size appeared to show that participants are able to recall more words if they rehearse them than if they do not – even if the rehearsal does not take place until after recall has been tested! As so often happens, these controversial findings received widespread coverage in the mainstream science media. However, subsequent attempts at replication have failed, including a study involving three independent replication attempts carried out by Richard Wiseman (University of Hertfordshire), Stuart Ritchie (University of Edinburgh), and myself (Goldsmiths, University of London).

If paranormal forces really do not exist, how are we to explain the widespread belief in them and the sizeable minority of the population who claim to have had direct personal experience of paranormal phenomena? One possible answer is that there are certain events and experiences which may appear to involve paranormal phenomena but which can in fact be fully explained in non-paranormal, usually psychological, terms. This is the approach adopted by anomalistic psychologists. In general, anomalistic psychologists attempt to explain such phenomena in terms of known psychological effects such as hallucinations, false memories, the unreliability of eyewitness testimony, placebo effects, suggestibility, reasoning biases and so on. It is noteworthy that anomalistic psychologists have, in just a few decades, produced many examples of replicable effects that adequately explain a range of ostensibly paranormal phenomena.

Anomalistic psychology is definitely on the rise. Not only is it now offered as an option on many psychology degree programmes, it is also an option on the most popular A2 psychology syllabus in the UK.  Every year more books and papers in high quality journals are published in this area and more conferences and symposia relating to topics within anomalistic psychology are held. There is no doubt that anomalistic psychology is flourishing.

And what of parapsychology? The health of this discipline is somewhat harder to assess but apart from the occasional ray of hope offered by the latest false dawn, the situation does not look encouraging for parapsychologists. Funding for such research is inevitably more difficult to obtain in times of economic uncertainty. Scarce research funding will be invested in areas where the probability of success is high – and the history of parapsychology shows all too clearly that studies in this area often involve huge investments of time and resources and produce nothing in return. Without a genuine breakthrough in the near future, can parapsychology survive for much longer? Without psychic powers, it’s difficult to know but I certainly would not bet on it.

It’s extraordinary to make discoveries about the universe…


This week’s guest post features an interview with Michael Brooks. As well as holding a PhD in quantum physics, Michael is an author, journalist and broadcaster. He’s a consultant to New Scientist, has a weekly column for the New Statesman, and is the author of the non-fiction bestseller ‘13 Things That Don’t Make Sense’. As part of an ongoing cycle of lectures, the City of Arts and Sciences in Valencia, Spain, together with the British Council, recently invited Michael Brooks to address the simple question of the origins of the universe.

Nicolas Jackson, from North by Southwest, a partnership between National Radio of Spain (RNE) and the British Council, caught up with Michael Brooks on the occasion.

For a quick taster, here are a few snippets from Michael’s interview, but you can listen to the full interview in the podcast at the end of the post.

Q When did humans first begin to take an interest in discovering the origins of the universe?

Michael Brooks It’s a really interesting phenomenon that today, in 2011, we think of there being an origin to the universe or a beginning, because actually that’s a relatively new idea. It wasn’t really put out there till the 1920s by a Belgian Catholic priest called Georges Lemaître. He came up with this idea of a day without yesterday, and there was a kind of firestorm, fireworks and suddenly, what he called the primeval atom, kind of exploded… and from this came the universe.

And… he kind of put this out in the late 1920s, and when Einstein heard about it in 1933, he said: “This is the most beautiful idea I’ve ever heard of”. In the meantime Edwin Hubble, the astronomer, had been gathering data that showed that most of the galaxies that surround us are moving away from us very fast, and if you wind that back, that implies that somehow they were all together in one place at the same time, which we would consider to be the beginning of the universe.

This seems like a common-sense idea to us now, actually it wasn’t accepted until the 1960s; it did 30 years in the cold and there were various debates over whether the universe had always existed. You couldn’t say anything about a beginning until we discovered the cosmic microwave background radiation, which was the echo of the Big Bang, and proved that there was some kind of cosmic explosion, like Lemaître had said. And that was the point at which we just dropped the idea of there being a steady state, always existing universe, and decided that there had to have been a beginning of everything.

Q Might the idea of the origins of the universe be challenging for certain religious sects in the same way that Darwin’s Origin of Species has been?

Michael Brooks It’s very important to realise that scientists aren’t deliberately undermining people of faith and religious ideas. What they are doing is looking out into the cosmos and finding evidence for this and for that, and with that evidence we adjust our ideas – of course with Galileo we adjusted our ideas about whether the earth was at the centre of the universe. Based on the evidence we had to change that to having the sun as the centre of the solar system and the earth spinning around it.

Now, there is some backlash against this, particularly in the United States, where people want to only deal in terms of what their faith tells them to believe, or what their religious leaders tell them to believe. Science is no respecter of that really, in many ways, science comes in and says, “this is just what the evidence says, and this is what our experiments tell us,” or, “this is what we uncover in the fossil record.” I don’t think there is a deliberate attempt to create trouble; it’s certainly not an attempt to undermine some of the other benefits of faith communities and everything else. I think it’s just that there are historically always areas where science just treads on the toes of people who hold religious faiths, and whereas science doesn’t really kind of pull any punches, the religious people, the religious leaders have to bend and accommodate the new scientific understanding. So this is always going to happen, I think.

Q Scientific discovery has obviously accelerated massively in the last hundred years. How much more is there for mankind to discover?

Michael Brooks Science is actually very humble in a sense, in that we’ve had 400 years of discovery, and cosmology has uncovered the history of the universe – 13.7 billion years old. But at the same time we realise how little we know, and we’ve discovered that 96% of the universe is in some form that we don’t understand, 72% is dark energy, a mysterious force that seems to be pushing on the very fabric of the universe, and 24% is dark matter, the stuff that exists out there, we know it must be there, or we think it must be there, or our calculations say it must be there. And we then have to work out what it is and look for it, and we’ve actually been looking for it properly for about 40 years now and still not found any clue about where it might be, or what kind of particles these might be.

So it keeps us humble, in a sense inside, and that’s one of the great things, [that] for every discovery that we make, there seem to be about ten more unanswered questions coming. And I think that’s one of the beauties of science, that it never seems to end, it seems to provoke more and more curiosity and questions.

Q You and the City of Arts and Sciences in Valencia share a desire to bring science closer to ordinary people and to make it accessible. Many people might see this as the exact opposite of the arts, where great art is not always meant to explain itself. Why is this?

Michael Brooks I think science takes the trouble because some of the concepts that we deal in are so abstract and so difficult to grasp. You can look at a painting and appreciate a painting without really knowing an awful lot about who painted it, or why, or what they were trying to get across, and you get this aesthetic beauty. Whereas some of the aesthetic beauty in science lies in very complicated equations, or in complicated ideas about, for instance, the beginning of the universe.

And so scientists are really taking it upon themselves to explain. And also there is a passion as well, about what we’ve discovered. It’s an extraordinary thing to be able to discover these things about the universe and how they work. So it’s very rewarding in and of itself to actually explain these to people and see their faces light up.

So maybe some of the arts, certainly painting and writing, people can take it in at whatever level they want to take it in at. So they don’t need so much kind of advocacy, they don’t need so much explanation and communication, whereas science is actually quite inaccessible until somebody is there acting as a bridge between the scientific community and the general public.

Podcast

North by Southwest 50 – Michael Brooks at The City of Arts and Sciences by British Council

North by Southwest is an English-language radio programme giving a taste of British and international culture and arts in Spain and also explores social, scientific and educational issues. North By Southwest is broadcast every week on RNE’s Radio Exterior (World Service) as part of its English-language programming.

Facts and figures – treat with caution


Dr David Barlow is Consultant in Genitourinary Medicine at St Thomas’ and Guy’s Hospitals, London. He has been the lead author for the chapter on gonorrhoea in the last three editions of the Oxford Textbook of Medicine. Between 1986 and 1993, at St Thomas’, he ran the largest linked HIV sero-survey in the United Kingdom. The third edition of his book Sexually Transmitted Infections: The Facts, Oxford University Press, with original cartoons by the late Geoffrey Dickinson, was published in March 2011.

There is something slightly uncomfortable about authoring a book whose cover proclaims: “XXX – The Facts”, with a sub-heading “All the information you need, straight from the experts”. Such is the house style of the OUP for its medical ‘Facts’ series, currently some 35 strong, but going forth and multiplying as you read.

Anyway, it got me thinking about how, in my specialty, when ‘facts’ become ‘figures’, caution is called for. I had an interest in heterosexual transmission of HIV in the 1980s and 1990s which put me in conflict with the official number-crunchers and I’m afraid I’m still suspicious when presented with totals. At the final proof stage of my ‘Facts’ book, I checked the Health Protection Agency’s website for the numbers of UK STIs reported for 2008. Unmentionable diseases including syphilis, gonorrhoea, warts and herpes remained as I had written. Total chlamydial infections, however, had changed from 126,882 (accessed July 2010) to 217,570 (accessed January 2011). A small adjustment might be reasonable. But 70%? This was a DB Type 4 numerical error.

DB’s numerical errors: Types 1-5

Type 1 Somebody has a vested interest: “If we tell these clap-doctors that laboratory culture of the gonococcus is only 70% sensitive, they’ll shut their lab’ and buy our ‘totally sensitive’ NAAT.”

Type 2 The totals may be correct but are misleading (1): “It is Government/Department (of Health) policy to pretend that there is a rapidly increasing HIV epidemic in heterosexuals who are transmitting within the UK.”

Type 3 The totals may be correct but are misleading (2): There is a genuine, probably innocent, misinterpretation of the figures (see horseradish sauce, below)

Type 4 The totals may be correct but are misleading (3): The explanation is perfectly reasonable and logical, but the calculation is opaque/we are keeping it to ourselves/forgot to tell you/have you read the small print?

Type 5 The totals are incorrect: Woops!

At the beginning of June, I awoke to BBC headlines about a doubling of UK-acquired HIV between 2001 and 2010. This drew me to the HPA’s website where I found a press release (June 6): ‘Last year there were 3,800 people diagnosed with HIV who acquired the infection in the UK, not aboard [sic], and this number has doubled over the past decade.’ From the same site: ‘… HIV diagnoses among heterosexuals who most likely acquired in the UK have risen in recent years from 210 in 1999 to 1,150 in 2010’. I shall return to these later but if you really have nothing better to do, why not see whether you can confirm the figures quoted above by accessing the HPA’s ‘New HIV Diagnosis,’ Table 5 here. And your next task (5 marks) is to re-word the press release…

Exactly thirty years ago, on 5th June 1981, the sleuths at the Centers for Disease Control published their crafty bit of epidemiology entitled ‘Pneumocystis pneumonia – Los Angeles’. The CDC had picked up an increase, from the West Coast, in requests for pentamidine. This was the drug used to treat PCP, a rare lung infection found in renal transplant patients whose immunity had been weakened (deliberately) to reduce rejection.

These new cases were different. The men were immuno-compromised but none were undergoing transplantation and all were gay. Thus were HIV and AIDS (although not so named for a year or two) introduced to an awe-struck, and soon fear-stricken, public.

Britain had its first AIDS case in 1981 and in August 1982 the Communicable Disease Surveillance Centre (the UK’s CDC) published the first of their monthly updates in the Communicable Disease Report, the CDR. The risk categories were divided into homosexual, haemophiliac, blood transfusion, intravenous drug users and heterosexuals [without other risk]. It was with these heterosexual cases that the distinction between ‘the truth’ and ‘the whole truth’ became lost during late 1986.

The May 1986 CDR tables broke down the heterosexual AIDS cases into: 3 with USA/Caribbean connection, 3 simply ‘heterosexual contact’ (of whom two “…had recently returned from Uganda and Mozambique.”), and 12 associated with sub-Saharan Africa. In October this connection became a footnote: “associated with sub-Saharan Africa” and by November, the categories had become: ‘contact UK’ and ‘contact abroad’. In December, the separate HIV figures were reported, without footnote, simply as ‘heterosexuals’ ( Type 2 numerical error ). Africa had disappeared from the tables.

By one of those coincidences loved by cynics and conspiracy theorists, the UK-wide leaflet drop about AIDS occurred in January 1987, the very next month, to be followed, in February, by the ‘Don’t die of ignorance’ campaign. The national press then published increasingly doom-laden descriptions, largely unchallenged, of the burgeoning UK AIDS epidemic in heterosexuals.

What actually mattered was the number of cases being transmitted in Great Britain. Was the disease spreading? What was the risk from a bonk?

The change in wording of the heterosexual categories in the late 1980s allowed speculation that the ‘infected abroad’ category was largely made up of British nationals who had gone overseas and returned with HIV/AIDS. This was the CDR’s interpretation when they gave advice to travellers in 1991 ( Type 3 numerical error ).

We published an alternative view in the Lancet (CDR did not print correspondence, commentary or criticism) and the CDSC, unusually given the chance to reply in the same edition, graciously and politely acknowledged our figures from St Thomas’ but said that they were not representative. Neither my first nor last experience as an outlier.

Have you ever made horseradish sauce? Epidemiologists and cookery-writers run similar risks. Counting and cooking need to be in their respective repertoires but, for both, the craft improves with hands-on experience: contact with patients, or trying out the recipe. If your cookbook doesn’t mention wearing goggles with the wind behind you while you grate this vicious root (and most don’t), the author has never made the sauce. Epidemiologists don’t need the formula for horseradish peroxidase either, but they may miss an open goal if they don’t see patients.

Four other hospitals in or near London (I confess to prompting) reported that most of their (no other risk) HIV-positive heterosexuals were, like ours, from Africa (Outliers 5, Regression Lines 0). It was not until later in the 1990s that the CDSC accepted the UK heterosexual HIV/AIDS epidemic to be largely imported, with little evidence of significant transmission between heterosexuals from, or in, this country.

So, how did you get on with the HPA’s table 5? You found the 210 for 1999 easily enough, I’m sure. But the 1,150 (and 3,800) for 2010? Well, a helpful person in the HPA’s epidemiology section told me they reached this figure by extrapolating the, as yet, uncategorized (‘not reported’ – penultimate row Table 5) cases in the same proportion as the different categories where the region of infection was actually known ( Type 4 ).

“But you didn’t apply that correction to the 210 in 1999”.

“Ah, no. We didn’t!” ( Type 5 ).
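For readers who want to see the arithmetic, here is a minimal sketch in Python of the kind of proportional extrapolation described above. The counts are invented for illustration; they are not the HPA’s actual Table 5 figures.

```python
# Minimal sketch of proportional extrapolation of uncategorised diagnoses.
# All counts are invented for illustration, NOT the HPA's Table 5 figures.

categorised = {
    "infected in UK": 800,
    "infected abroad": 1600,
}
not_reported = 1200  # diagnoses where the region of infection is unknown

total_known = sum(categorised.values())

# Share out the unknowns in the same proportions as the known categories,
# then add them to the reported counts.
adjusted = {
    region: count + not_reported * count / total_known
    for region, count in categorised.items()
}

for region, estimate in adjusted.items():
    print(f"{region}: {estimate:.0f}")
# With these invented inputs, "infected in UK" rises from 800 to 1,200.
```

The exchange above shows the catch: unless the same correction is applied to the 1999 baseline, comparing an extrapolated recent figure with a raw earlier one exaggerates any rise.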

And, finally, the Type 1 numerical error? Specificity is also important in diagnostic tests (the 55-year-old Granny who went to her GP for a smear test, was screened for chlamydia, and came out with gonorrhoea. Yes, truly!). The Nucleic Acid Amplification Tests for gonorrhoea may give you false positives.

Why didn’t you tell me about this before, Mother?

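As a rough illustration of why specificity matters so much when screening a low-prevalence population, here is a sketch of the positive predictive value of a highly specific test. The prevalence, sensitivity and specificity figures are assumptions chosen for the arithmetic, not taken from any published NAAT evaluation.

```python
# Hedged illustration of positive predictive value at low prevalence.
# All figures are assumptions for the sake of the arithmetic.

prevalence = 0.001    # assume 1 true infection per 1,000 women screened
sensitivity = 0.98    # assumed
specificity = 0.99    # assumed, i.e. a 1% false-positive rate

true_positives = prevalence * sensitivity
false_positives = (1 - prevalence) * (1 - specificity)

ppv = true_positives / (true_positives + false_positives)
print(f"Chance that a positive result is a true infection: {ppv:.0%}")
# With these assumptions the answer is roughly 9%: about nine out of ten
# positives would be false, which is how a woman attending for a smear test
# can end up with a gonorrhoea result that is most probably wrong.
```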

So, am I advising less sex?

What, and put myself out of a job? Give over!

References

Barlow D (2004) HIV/AIDS in ethnic minorities in the United Kingdom. In Ethnicity and HIV: prevention and care in Europe and the USA, eds Erwin, Smith and Peters, 21-46.

Barlow D, Daker-White G and Band B (1997) Assortative mixing in a heterosexual clinic population – a limiting factor in HIV spread? AIDS, 11: 1039-44.

Science owes much to both Christianity and the Middle Ages

This week’s guest blogger is James Hannam, who has a PhD in the History and Philosophy of Science from the University of Cambridge and is the author of The Genesis of Science: How the Christian Middle Ages Launched the Scientific Revolution (published in the UK as God’s Philosophers: How the Medieval World Laid the Foundations of Modern Science).

The award of the Templeton Prize to the retired president of the Royal Society, Martin Rees, has reawakened the controversy over science and religion. I have had the pleasure of meeting Lord Rees a couple of times, including when my book God’s Philosophers (newly released in the US as The Genesis of Science) was shortlisted for the Royal Society science book prize. I doubt he has welcomed the fuss over the Templeton Foundation, but neither will he be particularly perturbed by it.

Few topics are as open to misunderstanding as the relationship between faith and reason. The ongoing clash of creationism with evolution obscures the fact that Christianity has actually had a far more positive role to play in the history of science than commonly believed. Indeed, many of the alleged examples of religion holding back scientific progress turn out to be bogus. For instance, the Church has never taught that the Earth is flat and, in the Middle Ages, no one thought so anyway. Popes haven’t tried to ban zero, human dissection or lightning rods, let alone excommunicate Halley’s Comet. No one, I am pleased to say, was ever burnt at the stake for scientific ideas. Yet, all these stories are still regularly trotted out as examples of clerical intransigence in the face of scientific progress.

Admittedly, Galileo was put on trial for claiming it is a fact that the Earth goes around the sun, rather than just a hypothesis as the Catholic Church demanded. Still, historians have found that even his trial was as much a case of papal egotism as scientific conservatism. It hardly deserves to overshadow all the support that the Church has given to scientific investigation over the centuries.

That support took several forms. One was simply financial. Until the French Revolution, the Catholic Church was the leading sponsor of scientific research. Starting in the Middle Ages, it paid for priests, monks and friars to study at the universities. The church even insisted that science and mathematics should be a compulsory part of the syllabus. And after some debate, it accepted that Greek and Arabic natural philosophy were essential tools for defending the faith. By the seventeenth century, the Jesuit order had become the leading scientific organisation in Europe, publishing thousands of papers and spreading new discoveries around the world. The cathedrals themselves were designed to double up as astronomical observatories to allow ever more accurate determination of the calendar. And of course, modern genetics was founded by a future abbot growing peas in the monastic garden.

But religious support for science took deeper forms as well. It was only during the nineteenth century that science began to have any practical applications. Technology had ploughed its own furrow up until the 1830s when the German chemical industry started to employ its first PhDs. Before then, the only reason to study science was curiosity or religious piety. Christians believed that God created the universe and ordained the laws of nature. To study the natural world was to admire the work of God. This could be a religious duty and inspire science when there were few other reasons to bother with it. It was faith that led Copernicus to reject the ugly Ptolemaic universe; that drove Johannes Kepler to discover the constitution of the solar system; and that convinced James Clerk Maxwell he could reduce electromagnetism to a set of equations so elegant they take the breath away.

Given that the Church has not been an enemy to science, it is less surprising to find that the era which was most dominated by Christian faith, the Middle Ages, was a time of innovation and progress. Inventions like the mechanical clock, glasses, printing and accountancy all burst onto the scene in the late medieval period. In the field of physics, scholars have now found medieval theories about accelerated motion, the rotation of the earth and inertia embedded in the works of Copernicus and Galileo. Even the so-called “dark ages” from 500AD to 1000AD were actually a time of advance after the trough that followed the fall of Rome. Agricultural productivity soared with the use of heavy ploughs, horse collars, crop rotation and watermills, leading to a rapid increase in population.

It was only during the “enlightenment” that the idea took root that Christianity had been a serious impediment to science. Voltaire and his fellow philosophes opposed the Catholic Church because of its close association with France’s absolute monarchy. Accusing clerics of holding back scientific development was a safe way to make a political point. The cudgels were later taken up by TH Huxley, Darwin’s bulldog, in his struggle to free English science from any sort of clerical influence. Creationism did the rest of the job of persuading the public that Christianity and science are doomed to perpetual antagonism.

Nonetheless, today, science and religion are the two most powerful intellectual forces on the planet. Both are capable of doing enormous good, but their chances of doing so are much greater if they can work together. The award of the Templeton Prize to Lord Rees is a small step in the right direction.

The Genesis of Science: How the Christian Middle Ages Launched the Scientific Revolution is available now.

Shortlisted for the Royal Society Science Book Prize

“Well-researched and hugely enjoyable.” New Scientist

“A spirited jaunt through centuries of scientific development… captures the wonder of the medieval world: its inspirational curiosity and its engaging strangeness.” Sunday Times

“This book contains much valuable material summarised with commendable no-nonsense clarity… James Hannam has done a fine job of knocking down an old caricature.” Sunday Telegraph

Risk perception


David Ropeik is an international consultant in risk perception and risk communication, and an Instructor in the Environmental Management Program at the Harvard University Extension School. He is the author of How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts and principal co-author of RISK: A Practical Guide for Deciding What’s Really Safe and What’s Really Dangerous in the World Around You. He blogs for Huffington Post and Psychology Today, and has written guest blogs for Scientific American, Climate Central, and Big Think. He founded the program “Improving Media Coverage of Risk,” was an award-winning journalist in Boston for 22 years and a Knight Science Journalism Fellow at MIT.

You are reading a piece in Nature, so you are probably fairly well-educated, and there is a better than even chance that you fancy yourself a fact-based thinker and reasonably rational. Meaning no disrespect, but that assumption is fanciful, at least when it comes to the perception of risk. Ambrose Bierce was right when he defined the brain as “the organ with which we think we think.” Research from diverse fields, and countless examples from the real world, have convincingly established that our perceptions of risk are an inextricable blend of fact and feeling, reason and gut reaction, cognition and intuition. No matter what the hard risk sciences may tell us the facts are about a risk, the social sciences tell us that our interpretation of those facts is ultimately subjective.

While this system has done a good job getting us this far along evolution’s winding road, it also gets us into trouble because sometimes, no matter how right our perceptions feel, we get risk wrong. We worry about some things more than the evidence warrants (vaccines, nuclear radiation, genetically modified food), and less about some threats than the evidence warns (climate change, obesity, using our mobiles when we drive). That produces what I have labeled The Perception Gap, the gap between our fears and the facts, which is a huge risk in and of itself.

The Perception Gap produces dangerous personal choices that hurt us and those around us (declining vaccination rates are fueling the resurgence of nearly eradicated diseases). It causes the profound health harms of chronic stress (for those who worry more than necessary). And it produces social policies that protect us more from what we’re afraid of than from what in fact threatens us the most (we spend more to protect ourselves from terrorism than heart disease)…which in effect raises our overall risk.

We do have to fear fear itself…too much or too little. So we need to understand how our subjective system of risk perception works, in order to recognize and avoid its pitfalls. Surprisingly, few people are aware of how much we know about this system. (I’ve tried to summarize that knowledge in my book, How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts). Here’s a mad dash through the literature on risk perception:

• Neuroscience by Joseph LeDoux et al. has discovered neural pathways that ensure that we respond initially to risky stimuli subconsciously/instinctively, before cognition kicks in. And in the ongoing risk response that follows, the wiring and chemistry of the brain also ensure that instinct and affect (feelings) play a significant role, sometimes the primary role, in how we perceive and respond to danger. Simplistically, the brain is designed to subconsciously feel first and consciously think second, and to feel more and think less.

• The research of Daniel Kahneman et al. has discovered a mental toolbox (as Gerd Gigerenzer puts it) of heuristics and biases we use to quickly make sense of partial information and turn a few facts into the full picture of our judgment. These mental shortcuts occur subconsciously, outside (and often before) conscious reasoning. This research further confirms that we are far more Homo Naturalis than Homo Rationalis.

• The Psychometric Paradigm research of Paul Slovic et al. has revealed a suite of psychological characteristics that make risks feel “more” frightening, or less, the facts notwithstanding. These ‘risk perception factors’ include:

[Figure: list of the risk perception factors]

• Recent research on the theory of Cultural Cognition by Dan Kahan et al. has found that our views on risks are shaped to agree with those of the people we most strongly identify with, based on our group’s underlying feelings about how society should operate. We fall into four general groups according to the sort of social organization we prefer, defined along two continua, represented as a grid. We all fall somewhere along these two continua, depending on the issue.

[Figure: the Cultural Cognition grid, defined by the Individualist/Communitarian and Hierarchist/Egalitarian continua]

Individualists prefer a society that maximizes the individual’s control over his or her life. Communitarians prefer a society in which the collective group is more actively engaged in making the rules and solving society’s problems (Individualists deny environmental problems like climate change because such problems require a ‘we’re all in this together’ communal response. Communitarians see climate change as a huge threat in part because it requires a social response). Along the other continuum, Hierarchists prefer a society with rigid structure and class and a stable predictable status quo, while Egalitarians prefer a society that is more flexible, that allows more social and economic mobility, and is less constrained by ‘the way it’s always been’. (Hierarchists deny climate change because they fear the response means shaking up the free market-fossil fuel status quo. Shaking up the status quo is music to the ears of Egalitarians, who are therefore more likely to believe in climate change.)

That risk is inescapably subjective is disconcerting for those who place their faith in the ultimate power of Pure Cartesian “I think, therefore I am” Reason. But the robust evidence summarized above makes clear that:

1. Risk perception is inescapably subjective

2. No matter how well educated or informed we may be, we will sometimes get risk wrong, producing a host of profound harms.

3. In the interest of public and environmental health, we need a more holistic, and more realistic, approach to what risk means. Societal risk management has to recognize the risk of risk misperception, the risk that arises when our fears don’t match the evidence, the risks of The Perception Gap.

Letting go of our naïve fealty to perfect reason will allow us to recognize and understand these hidden dangers. Once brought to light, the harms to society from declining vaccination rates, the lost benefits of genetically modified food, the morbidity and mortality and societal costs of obesity – these risks and many more can be studied and quantified and managed with the same tools we already use to manage the risks from pollution or crime or disease. The challenge is not how to manage the risks of the Perception Gap. The challenge is to rationally let go of our irrational belief in the mythical God of Perfect Reason, and use what we know about the psychology of risk perception to more rationally manage the risks that arise when our subjective risk perception system gets things dangerously wrong.

Further Reading:

The neuroscience of risk perception – LeDoux J, The Emotional Brain, Simon and Schuster, 1996

Heuristics and Biases – Kahneman, D., Slovic, P. & Tversky, A. Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press, 1982

The Psychometric Paradigm ‘risk perception factors’ – Slovic P, The Perception of Risk, Earthscan 2000

Cultural Cognition.

It just doesn’t feel right

This week’s guest blogger is Simon Laham, PhD, a social psychologist and a Research Fellow and Lecturer in Psychological Sciences at the University of Melbourne, Australia. His work focuses on the psychology of morality.

Matthew is playing with his new kitten late one night. He is wearing only his boxer shorts, and the kitten sometimes walks over his genitals. Eventually, this arouses him and he begins to rub his bare genitals along the kitten’s body. The kitten purrs and seems to enjoy the contact.

What do you think about this? Morally right or wrong? Well, if you’re like most, you think that Matthew’s behavior is not only pretty disgusting, but morally condemnable.

But now ask yourself why you think it’s wrong. No one is harmed here, after all; Matthew is having fun and it seems that the kitten isn’t too bothered. What about germs? Well, let’s say that the kitten has just been bathed and there is no chance of Matthew catching something. Still wrong?

When psychologist Jonathan Haidt presented participants in one of his studies with scenarios just like this (depicting harmless, but norm-violating behaviors, such as masturbating with frozen chickens and eating road kill), he found that many people relentlessly insisted that such behaviours were “just wrong,” even though they couldn’t muster any convincing justifications. These participants sat, “morally dumbfounded,” as Haidt put it, asserting simply that “it just feels wrong.”

When prodded, people’s moral foundations tend to wobble a little bit. Although many of us like to think that our moralities are firmly grounded in principles – thou shalt not kill, love thy neighbour as thyself – and that moral judgments spring from the logical application of such principles, it just so happens that many of our moral judgments aren’t driven by the rational, deliberative contemplation of moral rules at all. Rather they are driven by intuitions. We witness an action, experience an intuitive flash of disgust, or anger, for example, and, as a result, deem the action morally wrong. Matthew isn’t violating any lofty moral law with his kitten rubbing, he’s just doing something disgusting, and, thus, wrong.

Just where do these intuitions come from? It’s quite likely that they have an evolutionary basis. Put simply, we feel disgusted or angry about behaviors that somehow compromised the reproductive success of our evolutionary ancestors.

Take incest as an example. Those ancestors of ours who happened to have felt disgust at incest would have been less likely to commit it, and thus more likely to have produced viable offspring, passing on their incest-condemning genes to future generations. Certain moral intuitions conferred reproductive advantages in the past; those are the moral intuitions we feel today.

It’s quite sobering to realise that your moral outlook is shaped not by appeal to higher reason, but by the contingencies of your evolutionary history. Still more sobering, however, are results from other research which suggests that opinions about important moral questions are influenced by a raft of other, thoroughly irrelevant factors.

Consider this: if I had happened to write the Matthew scenario above in Chiller font or Blackadder ITC font or some other difficult-to-read font, chances are you would have found it even more morally wrong than you did originally. Some work from my own lab shows that when people have a difficult time processing a stimulus (because, for example, it’s hard to read), they are more likely to think it’s morally wrong than if they have an easy time processing it. The idea here is that “disfluent” processing feels negative, and this negativity seeps into our moral judgments, making us harsher moral critics.

Or consider this question: What entities in the world deserve our moral consideration? Apes? Dogs? Fetuses? This is not a trivial question. Your answers will form the basis of your attitudes towards vegetarianism, abortion, or animal experimentation, among other pressing moral issues. Yet even here we see the subtle influence of moral irrelevancies. When people are asked to generate a list of such morally worthy entities by selecting candidates from a longer list, they end up with fewer candidates than people asked to cross unworthy entities off a longer list. The size of your moral community, in other words, depends on how you happen to be asked to populate it.

The list of subtle shapers of moral judgment goes on: show people a clip from Saturday Night Live and they are more likely to make utilitarian judgments; have them make judgments in a dirty room, littered with used tissues and pizza boxes, and they become harsher moral judges; expose people to “fart spray” and they are less likely to endorse marriage between first cousins…

It should give you pause to realize that your judgments of right and wrong – be they about euthanasia, incest, abortion, or kitten masturbation – are subject to a range of non-rational gut feelings or intuitions rather than under the control of deliberative, rational reasoning processes. The belief that our moral compasses are guided by a set of well thought-out principles that we consciously and painstakingly apply to each new situation is simply inconsistent with the empirical evidence. This belief fails to capture the complexity of moral judgment and it ignores the now well-documented fact that our judgments of right and wrong are driven largely by intuitive and often irrelevant factors that reside largely outside of our awareness.