Essays

Getting It Wrong

Eve Corbel

Apparently it is human nature to jump to erroneous conclusions, then deny, defend, reinterpret, confabulate—whatever it takes to hang on to them

One afternoon when I went to the daycare to pick up my three-year-old granddaughter, I discovered that she had been telling the kids and staff, “My grandma threw my mummy in the bushes.” In fact, one day in 1979 I had been walking with her mother, then age six, on the sidewalk along a thoroughfare used by trucks, when I had a premonition that a truck would crash right beside us. I grabbed her up and ran away from the road into a grassy area lined with shrubbery, and we fell together into a clump of beaked hazel. A moment later a semi-trailer went over on its side and screamed down the road for a good fifty yards, throwing off sparks and debris all the way. No one was hurt, including us, and the incident entered family legend. At a recent Sunday dinner, someone had brought it up, and “My grandma threw my mummy in the bushes” is what the three-year-old went away with.

As I explained to the daycare workers what had actually happened, I wondered how much my own version might have changed in the thirty-five years since the incident. At this point I had read about half of Being Wrong: Adventures in the Margin of Error by Kathryn Schulz, a wonderful meditation on getting stuff wrong as a cornerstone of human intelligence, imagination and creativity. She starts with the errors of our senses, such as the superior mirage (a trick of arctic light), the inferior mirage (the shimmering pool of water we “see” on the sun-baked highway ahead) and inattentional blindness, all of which remind us that we see with our brain, not with our eyes. When we know we are being tricked, we enjoy it—in apprehending art, jokes, optical illusions and magic tricks, for example—but otherwise we don’t.

Then she considers other errors, mainly our propensity to respond intuitively and instantly to what’s going on around us and, based on this hastily gathered evidence, to jump to conclusions that are often mistaken. To compound matters, we cling to these errors with astonishing tenacity. There is even a word for an erroneous but unshakable belief: mumpsimus, said to have been coined inadvertently by a medieval monk known for reciting the phrase “quod in ore mumpsimus” instead of the proper “quod in ore sumpsimus” (“which we have taken into the mouth”). Sumpsimus translates as “we have taken” in English; mumpsimus translates as nonsense in any language. When someone finally challenged the monk, he snapped back that he had been saying it that way for forty years and would not throw out “my old mumpsimus for your new sumpsimus.” In 1545, Henry VIII referred to both words in a speech, giving mumpsimus royal approval.

It seems to us that we remember things as though they happened yesterday, but we get the details wrong, more and more of them over time, in such a systematic way that the attrition, as Schulz puts it, “can be plotted on a graph, in what is known, evocatively, as the ‘Ebbinghaus curve of forgetting’.” That is consistent with what scientists believe now, that a memory is not stashed holus-bolus in the brain but is reassembled by various bodily bits and functions when we send for it.
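The shape of that graph is steep at first, then flattening: we lose most of the details early, and the rest leak away slowly. In one common modern formulation (my gloss; neither Schulz's wording nor Ebbinghaus's own notation), retention R after time t is modelled as exponential decay,

$$R(t) = e^{-t/S},$$

where S is a stability constant for the memory in question: the stronger the memory, the larger S and the flatter the curve.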

Given what else we know about the nature of memory, perhaps the breadth of the human margin of error shouldn’t surprise us. In researching plagiarism as part of my teaching on editorial ethics, I came across “Speak, Memory,” an article by Oliver Sacks about his own distortions of memory. He describes, for instance, the experience of recounting a vivid memory only to have a sibling declare that it didn’t happen, or happened to someone else; or of confidently composing an essay and later being taken aback to discover that he had already written that essay—much of it verbatim, an act of innocent self-plagiarism. Should we be reassured by the fact that even a top-notch neurologist is susceptible to these errors, or should we be alarmed? Brain imaging, he says, shows that a vivid memory brings on “widespread activation in the brain” in a pattern that is the same whether the memory is real or false. He mentions the work of Elizabeth Loftus, who has implanted fake memories in people’s brains with “disquieting success.” In a ScienceDaily post that I read on the bus two days later (“Your memory is no video camera”), a neuroscientist named Donna Jo Bridge says that the human memory “is built to change, not regurgitate facts.” To cope with the continuous fast changes around us, our memory constantly rearranges bits, supplanting the old with the new and creating “a story to fit [our] current world.” Thanks to MRI technology, we can pinpoint the exact moment when new information pops in and colonizes an older memory. What the—? No wonder we get stuff wrong!

And the human memory is suggestible in even more ways. Politicians have more extramarital affairs than other people, right? No, but they get more publicity, so the information is more available, so that’s what we think. If we see the words bananas and vomit together, we hook them up as cause and effect. If we have heard the word eat recently, we are more likely to complete “SO_P” as soup than soap. And if you want us to vote for higher school funding, put the polling station in a school. Bold type, high contrast, rhyming slogans, prominent blue or red in the copy—these, not careful research and thought, are the elements of messages that we find most convincing.

These last humiliating tidbits come from another wonderful book, Thinking, Fast and Slow, a remaindered copy of which literally fell into my hands from the sale shelf in a crowded bookstore. The author, the psychologist Daniel Kahneman, winner of a Nobel prize in economics, offers another sort of entrance to questions about why we get things wrong, and why we won’t let go of erroneous beliefs. It has to do with the human cognitive process, he says, and its two main informants: System 1, which is intuitive, emotional and lightning-fast in interpreting input; and System 2, the cooler head, which works slowly to compare things, weigh the options, follow the rules. System 2 also keeps us civil when we feel the urge to act on an extreme feeling. Both systems live everywhere, not lodged in the left or right brain, and there’s some overlap. But generally, System 1 has its antennae out all the time, picking up data like a Twitter addict; then it trawls through the memory, stashing bits wherever they will strengthen associations already in place, and discarding the bits that don’t. System 2 is the careful, analytical, sober second thought, but with limited resources, like a fact checker with no access to a library or the internet. Both of these systems are good and magical and necessary, Kahneman writes, and both are a bit lazy. They put together the best story from what comes to hand, in a process he calls WYSIATI: What You See Is All There Is. System 1 infers much from little, taming unfamiliar material with heuristics and biases. For example, the question “How much would you contribute to save an endangered species?” is easily replaced in our minds by the simpler question “How much emotion do I feel when I think of dying dolphins?” and we don’t even notice the substitution. System 2, which doesn’t get out much and usually fails to notice when System 1 is introducing error by elision, “casually endorses” a lot of erroneous associations. In other words, our minds manage the data load by maintaining associative order: “everything reinforcing everything else.” One can’t help thinking of the guy who searches for his lost keys under a streetlamp, rather than where he thinks he lost them, because the light is better there. We suppress ambiguity, and therefore we see a world that is probably more coherent and user-friendly than it would be if we tried to absorb and process all the random sensory input available. That is a healthy impulse, but it requires intellectual shortcuts.

And some stubbornness. We feel very sure of our perceptions and beliefs, and we go to some trouble to stay that way. We deny, deploy defences, confabulate, refuse or reinterpret evidence—anything to maintain our moorings. “Certainty is lethal to two of our most redeeming and humane qualities, imagination and empathy,” Kathryn Schulz writes. And yet our instincts serve us well. We need them to think, to live, to enjoy living.

At that point I set down Being Wrong for a couple of days, during which I happened to hear a Radiolab podcast called “Are You Sure?” It included the story of Penny Beerntsen, an American woman who was raped by a stranger. Even as the man attacked her, she had the presence of mind to get a good look at him and try to scratch him to leave marks (this was 1985, before DNA testing was considered reliable). Shortly afterward, Beerntsen identified the man from mug shots. She also picked him out of an eight-man police lineup: not only did she recognize him immediately, but the sight of him set her to trembling and raised the hair on the back of her neck. The attacker was convicted and sent to prison. Some years later he persuaded the court to run DNA tests on the evidence, and it turned out that he was not the man who had attacked her.

Talk about being wrong! How could that happen? Beerntsen did everything right. She fought back, then tried to focus on gathering evidence, then listened to her own body at the police lineup. Can anyone be sure of anything, ever?

I plunged into Being Wrong, Part III, The Experience of Error, about our epic struggle to prove to ourselves that we are not wrong. Sometimes we do change our beliefs, but even then, as Daniel Kahneman reports, we tend to move seamlessly, or blindly, through the points between “thinking that we are right and knowing that we were wrong,” either too quickly or too gradually to notice the shift. And thanks to the endless updates to our memory—also not noticed by us—we remember our former beliefs as being much closer to our current ones than they actually were, expunging the memory of having changed our minds. When we do fall into the “terrain of pure wrongness,” as Penny Beerntsen did, or anyone who loses a wad in a market downturn, or gets betrayed by a lover, we’re lost.

Our ability to incorporate error can also be affected by our mood at the moment, our place of residence, our time of life. Teenagers are adored and loathed for being obdurate, for example, and the wisdom of our elders owes a lot to their growing certainty that one cannot be sure of anything. (What we know now about the act of remembering—gathering wisps and waiting for the processes to line up—is exactly how it feels, physically, to me and my sixty-something friends.) Things move more quickly and effortlessly for younger folks, so it is no wonder they are more taken aback when they find out they have erred, and no wonder they are more given to reconstructions: the time-frame defence (my timing was off), the near-miss defence (I was almost right), the out-of-left-field defence (I got messed up by the unforeseeable) and so on.

When I turned the page to chapter 11, Denial and Acceptance, what should I find but the story of Penny Beerntsen, the woman who misidentified her attacker. Schulz points out that although eyewitness accounts are the most convincing evidence in courtrooms, they are appallingly faulty: the most careful and observant witnesses get about 25 percent of the details wrong. (The rest of them get 26 to 80 percent wrong.) Yet we cling to our memories and beliefs, most of which are formed from crude, tiny data, and we tend to deny evidence to the contrary, or bend it to fit.

Reading this I was reminded of a workshop for magazine publishers a few months earlier, led by Craig Silverman, a journalist with a special interest in media errors, corrections, accuracy and verification—very useful in these days of fabricated stories, doctored images, robot social media feeds and general dreck. Among other things, he spoke of our reluctance to let go of wrong information once we have accepted it as truth. He mentioned a study subtitled “The Persistence of Political Misperceptions,” which showed that after people read and accepted false and unsubstantiated data—about the Iraqi weapons of mass destruction, for example—and then were given corrected information, few of them changed their minds. For a good number of subjects there was even a backfire effect: the factual, substantiated information only strengthened their original (erroneous) conviction.

Our refusal to live in doubt is largely a healthy instinct, Schulz writes—invoking Plato, Augustine, Freud, Kübler-Ross and others—when it is “sincere and subconscious.” It protects us from the anxiety and horror of feeling wrong all the time, or (shudder) constantly second-guessing and dithering, and living among others who do the same. But the price we pay is high. What about Holocaust denial, to bring up a painful example? Or climate change resistance? Or the “death panels” freakout of 2009, when Sarah Palin raised the spectre of bureaucrat-driven euthanasia after the US government broached the subject of socialized medicine?

The day after I read the Penny Beerntsen chapter in Being Wrong, I tuned in to another Radiolab podcast, “The Man Behind the Maneuver,” put together by a journalist who as a child had been saved from choking to death by a school nurse who applied the Heimlich Maneuver. The journalist found Henry Heimlich (now in his nineties) living in Cincinnati and went to talk to him. It is surprising and wonderful to hear the story of the Maneuver from Heimlich himself, whose work is so legendary that I was amazed to hear he was still alive. And there’s more. Not long after his rise to fame and celebrity (he appeared on TV to teach Johnny Carson and David Letterman how to do the Maneuver), Heimlich declared that the Maneuver would relieve asthma, and later he proposed it as a treatment for drowning victims. On a roll, he then offered cures for cancer and AIDS that were, to spin it as kindly as possible, eccentric. Heimlich’s colleagues and his own family finally managed to get these claims discredited, but it took years because of Heimlich’s absolute certainty, backed up by his authority, reinforced by his show-biz fame. In 2005, the Red Cross declared that back-slapping—still our instinct when someone is choking—is just as effective as the Heimlich Maneuver: five back thumps between the shoulder blades, to be exact, then the “abdominal thrusts,” as the Maneuver is now officially known. Years later, in the Radiolab interview, Heimlich still sounds certain about all of it.

We all strive to be certain. Certainty is a good feeling, even a necessary one if your well-being depends on it—if you’re an elected official, say, or a witness to a crime, or a mother of teenagers. Not only do we feel better when we are certain; we are also easily seduced by apparent certainty and confidence in others. Studies have shown that we much prefer the decisive, assured political candidate (or teacher, or boss) to the less certain one, regardless of other qualities: statistically we are more likely to vote for the confident liar than for the good guy who voices doubts. At a teacher training workshop I once attended, the instructors were unequivocal about what to do if one had so much as the shadow of a doubt in the classroom: “Act as if.” We also prefer doctors who are certain—even if they turn out to be mistaken, even if we have read the UK study that found clinicians to be “completely wrong” about diagnoses 40 percent of the time.

We invest the confident one with even more credibility and authority if she is taller, has a certain kind of face and, especially, if she is a celebrity. Heimlich’s fame as a lifesaver and TV personality was rock solid when he recommended the Maneuver for people who were drowning; otherwise he would have been ridden out of town on a rail. And one dynamic, attractive, confident celebrity like Jenny McCarthy, the actor and TV host who argues a causal connection between childhood vaccines and autism, can convince a lot more people a lot more quickly than a thousand practitioners of the plodding, painstaking, incremental work of medical science.

By the time I was about three-quarters of the way through Being Wrong, I was half-convinced that Grandma had probably never fallen into any bushes with Mummy, or thrown her in, or anything else. Meanwhile, bits of apparently related material had begun to come in from all directions and all media, à la Night of the Living Dead. Articles on paper and online, blog posts, novels, radio podcasts, ten-year-old books on the endcaps at the library, conversations overheard on the train, meaningless TV shows suggested by my partner’s Netflix account—everything interrogated or answered everything else. Was this a magical instance of interdependent co-arising? Or had I become a walking example of the tendency to strive for “cognitive ease,” as Daniel Kahneman calls it, cherry-picking incoming data so that everything reinforces everything else?

Only a few days after tucking into the Kahneman, I pulled my New Yorker out of the mailbox and found “The Gift of Doubt,” a piece about the late Albert O. Hirschman, an economist. It starts with the story of a nineteenth-century railway megaproject in the northeastern United States: a tale of extensive planning, testing and predicting, and of colossal underestimating of time, trouble and expense. Hirschman was also a planner of megaprojects, and he was particularly interested in “unintended consequences and perverse outcomes,” having observed that when everything goes wrong with a large project, people find solutions that they had never dreamed of, solutions much more elegant and useful and long-lasting than the original objectives, forged from the heat of terrific unexpected stress. He wondered if “the only way in which we can bring our creative resources fully into play is by misjudging the nature of the task”—to plan it and undertake it as if it were “routine, simple, undemanding of genuine creativity.”

Well, I have never had responsibility for an infrastructure project, but I can name a number of stories and articles and books and magazines that would never have been written, designed, produced or marketed, had their authors and publishers known what they were getting into. My heart goes out to the proprietors of the 35 percent of small businesses that, buoyed at the outset by optimism bias, don’t make it to the five-year mark, but isn’t it better, existentially, to have loved and lost?

And speaking of failing, I’ve seen more published eurekas, op-eds and how-to’s about failure in the last six months than in my whole life before that. Failing is good. Failing is natural. It’s all right to fail. Embrace failure. Make your children embrace failure. Failing makes you smart. Adapt: Why Success Always Starts with Failure. The Rise: Creativity, the Gift of Failure, and the Search for Mastery. Brilliant Blunders: From Darwin to Einstein—Colossal Mistakes by Great Scientists That Changed Our Understanding of Life and the Universe. You Are Not So Smart (subtitle ends with “and 46 Other Ways You’re Deluding Yourself”).

Some of this writing—maybe all of it—is illuminating and bracing: for example, Paul Tough’s book How Children Succeed, showing that success is born of curiosity, perseverance and resourcefulness, rather than high test scores; and To Forgive Design: Understanding Failure, by Henry Petroski, who has been writing for thirty years about buildings that fall down, bridges that collapse, dams that break and other failures that alert us to “weaknesses in reasoning, knowledge, and performance that all the successful designs may not even hint at.”

But wait. Error and failure are different, aren’t they? If a bridge gives way fifty years after it was designed and constructed, having supported many more vehicles weighing many more tons than the engineers could have predicted, is that a design error? Well, sometimes it is. In at least one case, in Quebec, an investigation showed that some small components of a failed bridge, called eyebars, could wear down but could not be inspected.

In fact, there is so much failure writing that a body of pushback lit is also accumulating. In an online post, Sam McNerney pinpoints the typical agenda: that it is admirable (and fashionable) to fail, but only if we eventually achieve success, which is measured in terms of fame and/or fortune. No one writes a story about their permanent failure, and no one wants to read such a story. Liza Mundy, in her article “Losing Is the New Winning,” mentions eight new books on the subject. She notes that the failure fad opens the door for the Eliot Spitzers of the world to exhibit their pain, absolve themselves, be admired and then get back in the race. It’s important to forgive people for doing bad things, but as Mundy writes, “When is a public figure’s failure a sign of abiding character flaws, and when is it a harbinger of growth?” The same can be said of civilians.

There are interesting connections to be explored between cognitive error and failure, and the McNerney and Mundy articles and others like them raise good points. But both Being Wrong and Thinking, Fast and Slow turn up in them, and I can’t help thinking that in landing the evidence, the authors got some bycatch. Both Kathryn Schulz and Daniel Kahneman are clear on why they went to the trouble of writing it all down for us. Both writers are enchanted by the elegant, sophisticated workings of our minds, and both speak of public conversation rather than private redemption.

Kahneman, who describes human intuition as “marvelous,” says that his purpose is to give readers the skills to “identify and understand errors of judgment and choice,” in ourselves and others, “by providing a richer and more precise language to discuss them.” Schulz ends her book with accounts of people and institutions who have acknowledged error, apologized for it, gathered data on what went wrong, and changed policies and practices to fix it. Her examples include Beth Israel Deaconess Medical Center and the US commercial aviation industry, whose errors can be devastating. The emphasis is on striving for consistent quality, measuring and analyzing results, making decisions based on real data rather than assumptions and guesswork, apologizing for error and changing whatever caused that error. Being right brings out the worst in us, Schulz says, but when we do accommodate fallibility and acknowledge wrongness, we become more compassionate. For starters, we can learn to listen—even in the noisy, egocentric, brand-crazy society that swirls around us.

Listening is also the example offered by Albert O. Hirschman, the megaproject planner who was interested in unexpected endings, and who avoided conventional criteria for measuring success. In surveying large World Bank-sponsored projects on four continents, ranging from irrigation to power transmission, he recommended that a project not be evaluated just by measuring benefits, but also by asking how many “conflicts… it brought in its wake,” and “crises… it occasioned and passed through.” He welcomed adversity, and he welcomed doubt. Kathryn Schulz, too, says that we could do worse than to remember Socrates, who tried to fill his students with uncertainty—not fearfulness but aporia, “active, investigative doubt.”

At this point, part of me wants to go back into my journals and read my immediate account of the day in 1979 when I picked up my daughter and ran until we fell into the bushes. Having read about human error over the last year, I’m pretty sure I know what I will find, or not find, in the journals. Another part of me wants to leave it alone and wait to see how the story continues to evolve—by elision, or embellishment, or just plain error.

Eve Corbel

Eve Corbel is a writer, illustrator, cartoonist, mom and grandma. Her writing and artwork have been published in numerous anthologies and periodicals, including Geist.
