The Poisoner’s Guide to Story Telling

Deborah Blum is a Pulitzer Prize-winning science writer and the author of five books, most recently The New York Times best seller, The Poisoner’s Handbook: Murder and the Birth of Forensic Medicine in Jazz Age New York. She also writes for publications ranging from The Wall Street Journal to Lapham’s Quarterly and blogs about chemistry, culture (and the occasional murder) for the Public Library of Science at Speakeasy Science, blogs.plos.org/speakeasyscience. She is the Helen Firstbrook Franklin Professor of Journalism at the University of Wisconsin-Madison.

So let me tell you the story of a suspected murder, a real one, the irresistibly tragic tale of a beautiful young actress of early 20th century Hollywood, the adventure-loving heroine of one successful film after another: Madcap Madge, The Flapper, and – what would turn out to be her last picture – Everybody’s Sweetheart.

The actress, Olive Thomas, had the look of a charming child, with a shining bob of dark, curly hair, big violet-blue eyes, and a pale, heart-shaped face. It was a look that launched her career, starting in 1914 when she’d won a “Most Beautiful Girl in New York City” contest. And it launched her marriage to a member of Hollywood’s inner circle, Jack Pickford – younger brother of screen star Mary Pickford.

The couple rapidly developed a reputation for wild behavior, intense partying, intense quarreling, usually over his numerous side affairs – he’d developed syphilis as a result of one of them. They separated, reunited, separated, tried again, delighting the gossip magazines. In early September 1920, the couple flew to Paris, reportedly on a reconciliation holiday. They checked into the Hotel Ritz and whirled off to enjoy time in a Prohibition-free city. At the end of one particularly drunken spree, Pickford and Thomas staggered into their hotel room at nearly three in the morning. As Pickford told the police, he was floating in a whiskeyed haze when Olive began screaming, over and over, “Oh my God, my God.”

He stumbled into the dimly lit bathroom, where she was leaning against the counter. Mistaking it for her sleeping medicine, she had picked up a bottle of the bichloride of mercury potion that he rubbed on the painful sores caused by syphilis, poured a dose, and chugged it down. As the corrosive sublimate burned down her throat, she had a moment to realize her mistake. He caught her up and carried her back to the bed, grabbing the phone and calling for an ambulance. “Oh my God,” she repeated, “I’m poisoned.”

And it’s at this point that I hope I’ve gotten you caught up in the story, so that you’ll read on as I pause to tell you something about the poisonous element mercury – its history, its chemistry, its use in everything from thermometers to medications, its rather insidious poisonous effects, and the fact that it’s most dangerous as part of a chemical compound, such as bichloride of mercury (HgCl2), which is far more easily absorbed than mercury in its pure, slippery and self-contained state.

I might even tell you that by the time of the Olive Thomas case, toxicologists had developed tests sensitive enough to detect the poison in human tissues from exposures as small as .005 of a grain of mercury bichloride. And I’d even tell you about how those tests worked, the way chemists would use a deft mixture of heat, acid, and vapor to coax mercury from a tissue sample to form a thin gleaming deposit on a copper probe. And all of that would lead you back to the question of how Olive Thomas died and whether it was, as her husband insisted, an accident.

Or at least that’s what I did in my book, The Poisoner’s Handbook: Murder and the Birth of Forensic Medicine in Jazz Age New York, although at far greater length. The idea is to weave science through the story well enough that it’s just part of the story. A little devious, you’re thinking. I do tell stories of science more directly, for instance in a story on chemical communication in Scientific American last fall, or in my blog on chemistry and culture, Speakeasy Science. But in my last book, I wasn’t thinking so much about the already science-literate audience. I was wondering about the outer circle, about whether I could spin a good enough story that people who don’t love chemistry at all would read the book anyway.

In fact, my non-fiction story is set in the period when the mystery novelist Agatha Christie started her career (debuting in 1920 with a strychnine-focused tale, The Mysterious Affair at Styles), and it was Christie whom I hoped to channel when writing about two crusading scientists in the early days of forensic toxicology. I won’t tell you I pulled it off perfectly, but I can tell you that The Poisoner’s Handbook was a finalist for a non-fiction Agatha award, given to favorite books of murder mystery readers. And that I talked about chemistry and poison at the annual conference of mystery writers, Malice Domestic, last spring. And that I was the only science writer in attendance.

I sometimes think of this more subtle weaving of science into a story as a kind of subversive education. And I think it matters. Because the audience, the one beyond the inner circle of the science literate, matters. If we believe what we say – that science communication is important because it helps us build a community with greater understanding of research – then we need to be creative in the ways we reach far and wide into that community. We need to care about the science disenfranchised as well as the science savvy. I don’t suggest this is the only goal of science communication or that my approach is right for every story or every book. But I will tell you that I hear from some surprising readers, most recently a 5th grade boy. I like connecting with that diverse audience. And I think experimenting with telling science stories has made me better at what I do.

Or so I hope. But, as they say, enough about me.

The stricken actress lingered in the hospital for three more days after she swallowed bichloride of mercury. And during those days, the newspapers repeated every rumor swirling around the couple – his infidelities had driven her to suicide; Pickford had wished to get rid of her and tricked his wife into taking the poison; as the days passed, he became more evil, she more saintly. So many people flocked to Thomas’s funeral in Paris that women fainted in the crush and the streets became carpeted with countless hats, knocked off and trampled.

The police launched an investigation, including an autopsy, and concluded that it was, as Pickford had said, just a terrible accident. In an interview with The Los Angeles Examiner after his return to California, Pickford couldn’t stop dwelling on how much his wife had wanted to live: “The physicians held out hope for her until the last moment, until they found her kidneys paralyzed. Then they lost hope. But the doctors told me she had fought harder than any patient they ever had.”

Not the happiest of conclusions. And it was not one that laid to rest all the doubts and whispers about Pickford. It may be one reason why he faded away as a Hollywood star. But then it’s real life with all its imperfections, not a mystery novel. And if my subversive plan worked here, you read on until I reached that conclusion.

 

Making Hay


This week’s guest blogger is Jan Zalasiewicz. He is a Senior Lecturer at the Department of Geology, University of Leicester, UK. His main research interests are in palaeoenvironmental change during episodes of Earth history ranging somewhat irregularly from the early Palaeozoic to the present (‘Anthropocene’) time. He has published two popular science books, The Earth After Us and The Planet in a Pebble. He is discussing science at the Hay Festival, tying in nicely with our mini-series on science festivals.

An Oscar is unexpectedly heavy. Given that such a thing is often awarded to actresses who tend to the fragile and gazelle-like, one might imagine that it should be a delicately spun confection. No: it’s solid metal through and through, and a couple of weeks in the gym beforehand, toning up the biceps, should be advised to any potential winners. This particular trophy was on one of the many cluttered desks in the crowded temporary office from which operations at the Hay-on-Wye Book Festival are masterminded; it had been recently won by one of those involved, and it (not he) had a slight dent to the forehead. There had obviously been a good party.

This is one of the many strange and magical things that one can encounter at that remarkable event, as one moves – no, is swept – through its course. I had been asked to give a talk there, describing the life (so to speak) and times (surprisingly various) of a pebble of Welsh slate, having recently written a book on such a thing. It’s one of the occasional treats that writing a popular science book brings with it. For, as almost every author knows, the rewards of book writing do not generally encompass yachts, pet football teams or inflated bank balances – particularly when one is attempting to popularise geology, even in microcosm. Rather, the rewards – other than the pleasure of the writing itself – lie largely in the small adventures that turn up, just now and then, once the book is out there, like a shiny silver sixpence in the solid plum pudding of Life.

Seeing the festival from the view of a writer (or ‘artist’, as the notice-boards endearingly put it) is especially revealing. As an operation, it is simply staggering. Like the apocryphal bumble-bee that shouldn’t fly but does, this multi-dimensional happening gets off the ground and, somehow, stays aloft. The arrivals, appearances and departures of a bewildering number of writers (here, one really does seem to be among a cast of thousands) are calmly (so it appears), efficiently (for sure) and above all amiably navigated, by a large team of employees and volunteers – and that’s even before the legions of bibliophiles come through the gates. Orchestrating the whole lot is Peter Florence, who is clearly very good at this kind of thing. When those painfully convoluted climate negotiations of Kyoto, Copenhagen and Cancun are finally re-cranked, it would be a good idea to put the whole show in his hands. Atmospheric CO2 levels should begin to plummet within the week.

Giving the talk itself was a touch different to giving a standard undergraduate lecture – or even the standard popular lecture of an evening. All the talks are in large tents. When the wind blows strongly (as on that Welsh springtime afternoon), the structure creaks and groans like a storm-tossed galleon. A spotlight illuminates the speaker while the audience is in darkness, so eye contact is mainly in the infra-red spectrum. There was nothing for it, then, but to hold forth in the manner of one of the more despairing tenors of a Wagner epic, and hope for the best. Luckily, the sound system was (of course) more than capably managed. Enough pebble science got through, thankfully, for the questions at the end to be right on the button.

It’s the noises off, though, that make the event. The lightly concussed Oscar was next to a small laptop streaming in the European Cup Final, around which a motley assemblage of volunteers and stray writers was clustered. Even with the rather pointilliste images (despite all the available bandwidth being plundered for the purpose), it was obvious that Barcelona were playing in some different part of the space-time continuum to the gallant Mancunians. For the second half, I was smuggled into (it wasn’t quite gatecrashing, honest) an exclusive party on the other side of the creative tracks. There, amongst other delights, was a very large 3D television screen. This was a new experience to me. Heaven knows what the new technology will do to the flying crockery in the more emotional soaps, but David Villa’s goal soared like a comet.

And, of course, there was lots of conversation, and the meeting of people that normally don’t cross my personal orbit. The Hay Festival is good for science – this year, there were John Barrow, Brian Cox, Martin Rees and many others. But at heart it represents a cross-section of human life and interests that, to a scientist, provides a kind of reality check. Science may have brought about the conditions by which the Earth can support, now, seven billion people. People, however, generally have more immediate concerns than science; more, even, than the state of the Earth that supports them. Most people live very much in the world of people, and within the intricate human networks that seem like a living vindication of the ‘noosphere’ – the global sphere of human thought that Teilhard de Chardin proposed almost a century ago.

This is a sphere crammed with human tragedy, triumph, greed, commerce, low comedy, love and hope and fear – in the tradition of Tolstoy and of Jackie Collins too. By contrast, the world inhabited by, say, the average Earth scientist spans billions of years, encompasses extraordinary volcanoes, deep glaciations and bizarre life-forms. This world is literally inhuman, one well-nigh impossible to connect with emotionally: even the most ardent cat-lover would not think to scritch the ear of a sabre-tooth tiger, while the trilobites and armoured fish of deeper times might as well have been on another planet.

The two worlds are interconnected, naturally. We all need a stably functioning Earth. And the Earth, these days, in a sense, needs us – especially given how many of its functions we have appropriated. It needs us collectively, at least, to try to steer the least damaging course consistent with human need. Occasions like Hay might – just – help bridge the gap. And, of course, one can have a hell of a good time there too. Even scientists are human, after all.

If you want to read more highlights from our mini-series on science festivals, you can find a summary of all our coverage here.

Science: A Four Thousand Year History


This week’s guest blogger, Patricia Fara, discusses some problems she faced when deciding how to begin her most recent book, Science: A Four Thousand Year History. She lectures on the history of science at Cambridge University, where she is Senior Tutor of Clare College. Her other successful books include Newton: The Making of Genius (2002), Sex, Botany and Empire (2003) and Pandora’s Breeches: Women, Science and Power in the Enlightenment (2004).

Lewis Carroll knew how difficult it can be to tell a story. ‘Where shall I begin, please your Majesty?’, asked the White Rabbit. Alice listened for the answer. ‘Begin at the beginning,’ the King said, gravely, ‘and go on till you come to the end: then stop.’

To write Science: A Four Thousand Year History, I had to decide when science began. This is no trivial question, but gets right to the heart of what science might be. Looking back at the past, it is possible to pick out ideas and discoveries that later became incorporated within today’s global scientific enterprise. But at the time, they contributed to other goals – finding an auspicious time for religious festivals, winning wars, vindicating biblical prophecies, making a living.

Separating science from superstition is not always easy. When early astronomical observers looked up into the heavens, they saw seven planets circling around the Earth. The Sun and the Moon were the most obvious, but they also identified five others – Saturn, Jupiter, Mars, Venus and Mercury (the next one to be discovered, Uranus, was only spotted at the end of the eighteenth century). Finding planets, and working out how they move across the sky, demands skills that are important for modern science. On the other hand, the first sky-watchers were not primarily interested in how the universe operates, but instead were trying to relate the patterns of the stars to major events on earth, such as famines, floods or the death of a king.


So it seems wrong to call them scientists. But does it make sense to disparage their conclusions? Modern astronomy rests on a foundation of data collected by expert star-gazers who were also astrologers. Their observations were generally sound, even if their theories have since been rejected. Many scientists find it hard to accept that their own expertise is rooted in beliefs which they dismiss as magic. For those who pledge their faith in progress, magical mumbo-jumbo has been eliminated by scientific reason: magic and science are clearly polar opposites, and any notion that they might share common origins is sacrilegious. But this comforting view is not always easy to reconcile with the historical facts.

Consider Isaac Newton. He believed so firmly in the Greek idea of a harmonic universe that he divided the rainbow into seven colours to correspond with the musical scale. Before then, although opinions varied, artists mostly showed rainbows with four colours. It is, of course, impossible to make any objective decision about the correct number, because the spectrum of visible light varies continuously: there is no sharp cut-off between bands of different colours, so how you think about a rainbow affects how you see it. Be honest – can you tell the difference between blue, indigo and violet?

Since Newton has become an iconic scientific genius, it would seem strange to say that he did not practise science. On the other hand, modern scientists denigrate many of his activities as ridiculous, or even antithetical to science. In addition to his preoccupation with numbers and biblical interpretation, Newton carried out alchemical experiments, poring over ancient texts and carefully recording his own thoughts and discoveries. This was no mere hobby: Newton regarded alchemy as a vital route to knowledge and self-improvement, and he incorporated his findings within his astronomical theories.

The example of Newton illustrates how hard it is to pin down exactly when science began. One possibility is to look for the first scientists. But the word scientist was not even invented until 1833, and even then was slow to catch on. Both Michael Faraday and Charles Darwin refused to let themselves be labelled with the new term, but a history that excludes them would seem strange. The most popular starting date is 1543, when Nicolaus Copernicus suggested that the Sun and not the Earth lies at the centre of our planetary system. However, there are several objections to this choice, not least that it excludes the Islamic sages whose ideas were so significant in Renaissance Europe, and also the Greeks, whose theories remained influential well into the eighteenth century. So some historians choose to begin with the geometer Thales of Miletus, who lived on the Turkish coast around 2500 years ago, and successfully predicted an eclipse. But picking him results in leaving out all of his important predecessors, such as the Egyptians and the Babylonians.

For Science: A Four Thousand Year History, I decided to start with the Babylonians, whose way of thinking about the universe still affects modern science. Instead of counting in tens and hundreds, they used a base of sixty, which is why there are 360 degrees in a circle. Their complex mathematical techniques and detailed star observations enabled them to predict celestial events – and because their knowledge of the skies was inherited by later observers, it now forms the basis of astronomy as well as structuring everyday life. Thanks to the Babylonians, weeks have seven days, hours have sixty minutes, and minutes have sixty seconds. The next time you look at a digital clock, remember that it has more in common with a clay tablet than you might think.
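The Babylonian legacy described above is easy to see in a few lines of code. This is just a toy illustration (the function name and the choice of three base-60 places are mine, not anything from the book): it breaks a quantity into base-60 “digits”, the way Babylonian scribes recorded numbers, and shows why 360 degrees and the 86,400-second day come out so tidily.

```python
def to_sexagesimal(value, places=3):
    """Break a whole-number quantity into base-60 'digits',
    most significant first, as Babylonian scribes recorded numbers."""
    digits = []
    whole = int(value)
    for _ in range(places):
        digits.append(whole % 60)
        whole //= 60
    return list(reversed(digits))

# A full circle of 360 degrees is 6 * 60 -- a round number in base 60.
print(to_sexagesimal(360))    # [0, 6, 0]

# A day of 86400 seconds: 24 hours of 60 minutes of 60 seconds.
print(to_sexagesimal(86400))  # [24, 0, 0]
```

The same division survives on any clock face: converting seconds into hours, minutes and seconds is exactly a base-60 positional expansion.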

Story-telling, Statistics, And Other Grave Scientific Insults

In the second of our series of guest blog posts, Aaron Clauset (homepage, blog), Assistant Professor of Computer Science and Member of the Colorado Initiative in Molecular Biotechnology at the University of Colorado at Boulder, discusses the tensions between number- and narrative-based descriptions in science.

The New York Times (and the NYT Magazine) has been running a series of pieces about math, science and society written by John Allen Paulos, a mathematics professor at Temple University and author of several popular books. His latest piece caught my eye because it’s a topic close to my heart: stories vs. statistics. That is, when we seek to explain something [1], do we use statistics and quantitative arguments using mainly numbers or do we use stories and narratives featuring actors, motivations and conscious decisions? [2] Here are a few good excerpts from Paulos’s latest piece:

…there is a tension between stories and statistics, and one under-appreciated contrast between them is simply the mindset with which we approach them. In listening to stories we tend to suspend disbelief in order to be entertained, whereas in evaluating statistics we generally have an opposite inclination to suspend belief in order not to be beguiled. A drily named distinction from formal statistics is relevant: we’re said to commit a Type I error when we observe something that is not really there and a Type II error when we fail to observe something that is there. There is no way to always avoid both types, and we have different error thresholds in different endeavors, but the type of error people feel more comfortable with may be telling…

I’ll close with perhaps the most fundamental tension between stories and statistics. The focus of stories is on individual people rather than averages, on motives rather than movements, on point of view rather than the view from nowhere, context rather than raw data. Moreover, stories are open-ended and metaphorical rather than determinate and literal.

It seems to me that for science, the correct emphasis should be on the statistics. That is, we should be more worried about observing something that is not really there. But as humans, statistics is often too dry and too abstract for us to understand intuitively, to generate that comfortable internal feeling of understanding. Thus, our peers often demand that we give not only the statistical explanation but also a narrative one. Sometimes, this can be tricky because the structures of the two modes of explanation are in fundamental opposition, for instance, if the narrative must include notions of randomness or stochasticity. In such a case, there is no reason for any particular outcome, only reasons for ensembles or patterns of outcomes. The idea that things can happen for no reason is highly counterintuitive [3], and yet in the statistical sciences (which is today essentially all sciences), this is often a critical part of the correct explanation [4]. For the social sciences, I think this is an especially difficult balance to strike because our intuition about how the world works is built up from our own individual-level experiences, while many of the phenomena we care about are patterns above that level, at the group or population levels [5].
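Paulos’s Type I / Type II distinction can be made concrete with a small simulation. This is only a sketch under assumptions of my own (a crude two-standard-error decision rule, a small true effect of 0.3, and samples of 30 draws are illustrative choices, not anything from his piece): it estimates how often we “observe something that is not really there” and how often we “fail to observe something that is there.”

```python
import random
import statistics

random.seed(1)  # reproducibility for the illustration

def detects_effect(sample, threshold=2.0):
    """Crude test: claim an effect if the sample mean lies more than
    `threshold` standard errors away from zero."""
    n = len(sample)
    se = statistics.stdev(sample) / n ** 0.5
    return abs(statistics.mean(sample)) > threshold * se

trials, n = 2000, 30

# Type I errors: the true effect is zero, yet we "observe" one anyway.
type_1 = sum(
    detects_effect([random.gauss(0.0, 1.0) for _ in range(n)])
    for _ in range(trials)
) / trials

# Type II errors: a real (small) effect of 0.3 is there, but we miss it.
type_2 = sum(
    not detects_effect([random.gauss(0.3, 1.0) for _ in range(n)])
    for _ in range(trials)
) / trials

print(f"Type I rate  (false alarm): {type_1:.3f}")
print(f"Type II rate (miss):        {type_2:.3f}")
```

With these settings the false-alarm rate hovers near the conventional 5%, while the small real effect is missed much more often, which is Paulos’s point that there is no way to avoid both error types at once: tightening the threshold trades one for the other.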

This is not a new observation and it is not a tension exclusive to the social sciences. For instance, here is Stephen Jay Gould (1941-2002), the eminent American paleontologist, speaking about the differences between microevolution and macroevolution (excerpted from Ken McNamara’s “Evolutionary Trends”):

In Flatland, E. A. Abbott’s (1884) classic science-fiction fable about realms of perception, a sphere from the world of three dimensions enters the plane of two-dimensional Flatland (where it is perceived as an expanding circle). In a notable scene, he lifts a Flatlander out of his own world and into the third dimension. Imagine the conceptual reorientation demanded by such an utterly new and higher-order view. I do not suggest that the move from organism to species could be nearly so radical, or so enlightening, but I do fear that we have missed much by over reliance on familiar surroundings.

An instructive analogy might be made, in conclusion, to our successful descent into the world of genes, with resulting insight about the importance of neutralism in evolutionary change. We are organisms and tend to see the world of selection and adaptation as expressed in the good design of wings, legs, and brains. But randomness may predominate in the world of genes—and we might interpret the universe very differently if our primary vantage point resided at this lower level. We might then see a world of largely independent items, drifting in and out by the luck of the draw—but with little islands dotted about here and there, where selection reins in tempo and embryology ties things together. What, then, is the different order of a world still larger than ourselves? If we missed the world of genic neutrality because we are too big, then what are we not seeing because we are too small? We are like genes in some larger world of change among species in the vastness of geological time. What are we missing in trying to read this world by the inappropriate scale of our small bodies and minuscule lifetimes?

To quote Howard T. Odum (1924-2002), the eminent American ecologist, on a similar theme: “To see these patterns which are bigger than ourselves, let us take a special view through the macroscope.” Statistical explanations, and the weird and diffuse notions of causality that come with them, seem especially well suited to express in a comprehensible form what we see through this “macroscope” (and often what we see through microscopes). And increasingly, our understanding of many important phenomena, be they social network dynamics, terrorism and war, sustainability, macroeconomics, ecosystems, the world of microbes and viruses or cures for complex diseases like cancer, depends on us seeing clearly through some kind of macroscope to understand the statistical behavior of a population of potentially interacting elements.

Seeing clearly, however, depends on finding new and better ways to build our intuition about the general principles that take inherent randomness or contingency at the individual level and produce complex patterns and regularities at the macroscopic or population level. That is, to help us understand the many counter-intuitive statistical mechanisms that shape our complex world, we need better ways of connecting statistics with stories.


[1] Actually, even defining what we mean by “explain” is a devilishly tricky problem. Invariably, different fields of scientific research have (slightly) different definitions of what “explain” means. In some cases, a statistical explanation is sufficient, in others it must be deterministic, while in still others, even if it is derived using statistical tools, it must be rephrased in a narrative format in order to provide “intuition”. I’m particularly intrigued by the difference between the way people in machine learning define a good model and the way people in the natural sciences define it. The difference appears, to my eye, to be different emphases on the importance of intuitiveness or “interpretability”; it’s currently deemphasized in machine learning while the opposite is true in the natural sciences. Fortunately, a growing number of machine learners are interested in building interpretable models, and I expect great things for science to come out of this trend. In some areas of quantitative science, “story telling” is a grave insult, leveled whenever a scientist veers too far from statistical modes of explanation (“science”) toward narrative modes (“just so stories”). While sometimes a justified complaint, I think completely deemphasizing narratives can undermine scientific progress. Human intuition is currently our only way to generate truly novel ideas, hypotheses, models and principles. Until we can teach machines to generate truly novel scientific hypotheses from leaps of intuition, narratives, supported by appropriate quantitative evidence, will remain a crucial part of science.

[2] Another fascinating aspect of the interaction between these two modes of explanation is that one seems to be increasingly invading the other: narratives, at least in the media and other kinds of popular discourse, increasingly ape the strong explanatory language of science. For instance, I wonder when Time Magazine started using formulaic titles for its issues like “How X happens and why it matters” and “How X affects Y”, which dominate its covers today. There are a few individual writers who are amazingly good at this form of narrative, with Malcolm Gladwell being the one that leaps most readily to my mind. His writing is fundamentally in a narrative style, stories about individuals or groups or specific examples, but the language he uses is largely scientific, speaking in terms of general principles and notions of causality. I can also think of scientists who import narrative discourse into their scientific writing to great effect. Doing so well can make scientific writing less boring and less opaque, but if it becomes more important than the science itself, it can lead to “pathological science”.

[3] Which is perhaps why the common belief that “everything happens for a reason” persists so strongly in popular culture.

[4] It cannot, of course, be the entire explanation. For instance, the notion among Creationists that natural selection is equivalent to “randomness” is completely false; randomness is a crucial component of the way natural selection constructs complex structures (without the randomness, natural selection could not work) but the selection itself (what lives versus what dies) is highly non-random and that is what makes it such a powerful process.

What makes statistical explanations interesting is that many of the details are irrelevant, i.e., generated by randomness, but the general structure, the broad brush-strokes of the phenomena are crucially highly non-random. The chief difficulty of this mode of investigation is in correctly separating these two parts of some phenomena, and many arguments in the scientific literature can be understood as a disagreement about the particular separation being proposed. Some arguments, however, are more fundamental, being about the very notion that some phenomena are partly random rather than completely deterministic.

[5] Another source of tension on this question comes from our ambiguous understanding of the relationship between our perception and experience of free will and the observation of strong statistical regularities among groups or populations of individuals. This too is a very old question. It tormented Rev. Thomas Malthus (1766-1834), the great English demographer, in his efforts to understand how demographic statistics like birth rates could be so regular despite the highly contingent nature of any particular individual’s life. Malthus’s struggles later inspired Ludwig Boltzmann (1844-1906), the famous Austrian physicist, to use a statistical approach to model the behavior of gas particles in a box. (Boltzmann had previously been using a deterministic approach to model every particle individually, but found it too complicated.) This contributed to the birth of statistical physics, one of the three major branches of modern physics and arguably the branch most relevant to understanding the statistical behavior of populations of humans or genes.


Soapbox Science is a guest blog for any scientist to air their views on any subject of wide interest. If you’d be interested in contributing, please contact the site moderators.