10 Ideas I Learned This Week From Being Wrong: Adventures in the Margin of Error

Kathryn Schulz’s Being Wrong: Adventures in the Margin of Error provides numerous examples of how we fail in our decisions and why we habitually repeat the same mistakes. It offers several practical ideas that will improve decision making at the personal and organizational level. The challenge, as with any type of change, is the implementation. Sure, it’s easy to say you’ll seek out ideas you don’t agree with, but how many of us actually do that? The key is putting these ideas into practice and treating them like a skill that can be developed. Reading about them without using them doesn’t improve your ability. They are especially useful for those in knowledge-based fields, where you don’t always get the tangible feedback you would in a physical trade.

The 10 Best Ideas

- Follow the scientific method
- Memories are fallible
- People confabulate – they make up things because they can’t say “I don’t know”
- Our beliefs are overly based on personal experience, not evidence-based research
- Record ideas you don’t agree with
- Communities can fuel erroneous beliefs
- You need at least one dissenter in group decisions
- Leaders, especially those in the public spotlight, are expected to never admit “I don’t know” or that they have changed their mind
- Have conviction while still being open to being wrong
- Use forcing functions – these let us know if we are making a mistake and guide corrective action

1. Follow the scientific method

From Being Wrong:

As an ideal of intellectual inquiry and a strategy for the advancement of knowledge, the scientific method is essentially a monument to the utility of error. Most of us gravitate toward trying to verify our beliefs, to the extent that we bother investigating their validity at all. But scientists gravitate toward falsification; as a community if not as individuals, they seek to disprove their beliefs. Thus, the defining feature of a hypothesis is that it has the potential to be proven wrong (which is why it must be both testable and tested), and the defining feature of a theory is that it hasn’t been proven wrong yet. But the important part is that it can be—no matter how much evidence appears.

In fact, not only can any given theory be proven wrong; as we saw in the last chapter, sooner or later, it probably will be. And when it is, the occasion will mark the success of science, not its failure. This was the pivotal insight of the Scientific Revolution: that the advancement of knowledge depends on current theories collapsing in the face of new insights and discoveries. In this model of progress, errors do not lead us away from the truth. Instead, they edge us incrementally toward it.

The scientific method is a way of approaching and understanding life; it’s not just for scientists. It’s the best method we have to understand and interpret the world around us. Uncovering misbeliefs and mistakes is not failure; it’s all part of the process of evaluating your beliefs and updating them when confronted with better evidence. Changing your mind is painful. Admitting you’ve been doing or thinking something wrong for years or decades is uncomfortable. Unless, that is, you accept that everyone errs and that your only job is to keep uncovering your mistakes and move on. The worst thing you can do is have unrealistic expectations of your beliefs and then cling to them even when faced with overwhelming evidence.

2. Memories are fallible

From Being Wrong:

National tragedy is good to memory researchers. In 1986, when the space shuttle Challenger exploded, Neisser saw an opportunity to remedy this gap in the memory literature, and to find out whether his own mistaken Pearl Harbor recollection was an anomaly. He surveyed his students about their memories of the disaster the day after it happened, and then again three years later. The results spelled the end of conventional flashbulb memory theory. Less than 7 percent of the second reports matched the initial ones, 50 percent were wrong in two-thirds of their assertions, and 25 percent were wrong in every major detail. Subsequent work by other researchers only confirmed the conclusion. Our flashbulb memories might remain stunningly vivid, but research suggests that their accuracy erodes over time at the same rate as our everyday recollections—a decline so precise and predictable that it can be plotted on a graph in what is known, evocatively, as the Ebbinghaus curve of forgetting.

Memories are fallible. You should discount memories (including your own) since they are not perfect recordings of the past. Memories erode and are biased to make us look better. Whenever we recall the past, we should discount what we hear; it’s never an unbiased account, even when we try our hardest to remove any after-the-fact adjustments. This is one reason why writing down and recording your thoughts and ideas is so powerful: it captures your thinking at the time of the decision, not how you later come to recall it.
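For readers curious about the shape of that curve: Schulz only names it, but Ebbinghaus-style forgetting is commonly approximated by an exponential decay. One standard parameterization (an illustrative model, not taken from the book) is

$$R(t) = e^{-t/S}$$

where R(t) is the fraction of the memory retained after elapsed time t, and S is a stability constant for that memory – the larger S is, the slower the forgetting.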

3. People confabulate – they make up things because they can’t say “I don’t know”

From Being Wrong:

Hirstein says that once he began studying confabulation, he started seeing sub-clinical versions of it everywhere he looked, in the form of neurologically normal people “who seem unable to say the words, ‘I don’t know,’ and will quickly produce some sort of plausible-sounding response to whatever they are asked.” Such people, he says, “have a sort of mildly confabulatory personality.”

In other words, if we want to discredit a belief, we will argue that it is advantageous, whereas if we want to champion it, we will argue that it is true. That’s why we downplay or dismiss the self-serving aspects of our own convictions, even as we are quick to detect them in other people’s beliefs.

You can urge people not to believe anything based on meager evidence until you are blue in the face, but you will never succeed—because, as it turns out, believing things based on meager evidence is what people do. I don’t mean that we do this occasionally: only when we are thinking sloppily, say, or only if we don’t know any better, or only en route to screwing up. I mean that believing things based on paltry evidence is the engine that drives the entire miraculous machinery of human cognition.

People confabulate – there’s an irresistible urge to say anything other than “I don’t know.” Life is complicated and we can’t know everything, so we pretend to know. For the most part, it’s not a problem. But for big decisions in complicated environments, confabulation leads to grave mistakes. We are lazy; it’s easier to say we know something than to do the work and actually prove it. You have to challenge those around you whenever you suspect someone is talking beyond their ability or knowledge.

 

4. Our beliefs are overly based on our personal experience, not evidence-based research

From Being Wrong:

Human beings, on the other hand, have no trouble answering these questions, because we don’t care about what is logically valid and theoretically possible. We care about what is probable. We determine what is probable based on our prior experience of the world, which is where evidence comes in: we choose the most likely answer to any given question based on the kinds of things we have (and haven’t) experienced in comparable situations before.

Many of our beliefs are based on our personal experience, which accounts for some infinitesimal part of humanity’s history. Just because we experienced something doesn’t mean it bears any relation to the truth or the right answer. We overweight our own perspective, neglecting the millions (billions) of other people throughout history who likely saw things differently. The cautionary principle: don’t trust something because you saw it, but because there is good evidence to support it.

5. Record ideas you don’t agree with

From Being Wrong:

One person who recognized the value of doing this was Charles Darwin. In his autobiography, he recalled that, “I had, during many years, followed a golden rule, namely, that whenever a published fact, a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were far more apt to escape from the memory than favorable ones.”

Darwin recorded ideas that disagreed with his beliefs; we tend to expediently erase any idea that disagrees with our own. We don’t like to admit we are wrong. It’s our ego trying to protect us, and it ultimately gets in the way of learning from our mistakes. Learning from mistakes happens when you actively engage with ideas you disagree with, not when you keep embracing ideas you already accept.

6. Communities can fuel erroneous beliefs

From Being Wrong:

I call this problem our disagreement deficit, and it comes in four parts. Boiled down to their barest essence (we will unboil them in a moment), these parts are as follows. First, our communities expose us to disproportionate support for our own ideas. Second, they shield us from the disagreement of outsiders. Third, they cause us to disregard whatever outside disagreement we do encounter. Finally, they quash the development of disagreement from within.

These factors create a kind of societal counterpart to cognition’s confirmation bias, and they provoke the same problem. Whatever the other virtues of our communities, they are dangerously effective at bolstering our conviction that we are right and shielding us from the possibility that we are wrong.

Communities can fuel erroneous beliefs. While there are many benefits to communities, things go wrong when community-fueled beliefs are accepted without rigorous examination. Communities are identities: we tend to identify with certain beliefs and stop thinking, never examining why we hold them. Communities also exert conformity on their members, whether deliberately or indirectly. The best way to overcome this is to actively seek out communities that disagree with your existing ones.

 

7. You need at least one dissenter in group decisions

From Being Wrong:

As a result, internal dissent, unlike outside opposition, can be deeply destabilizing. Consider one of the striking findings of the Asch line studies: if just one of the fake subjects begins giving the right answers, all the real subjects start doing so as well. Seen from one angle, this finding is heartening, since it suggests that a single person speaking freely suffices to break the stranglehold of conformity—like the little boy pointing out that the emperor has no clothes. Seen from a different angle, however, it suggests that a lone dissident can destroy the cohesiveness of an entire community.

You must have at least one true dissenter in any group decision. It only takes one person to negate the effects of groupthink and peer pressure. Someone has to challenge ideas – not just by role-playing a devil’s advocate position, but by actually holding an opposing view. As long as you have one dissenter, you give confidence to other group members who may be thinking the same thing but don’t want to stand alone.

8. Leaders, especially those in the public spotlight, are expected to never admit “I don’t know” or that they have changed their mind

From Being Wrong:

William Hirstein even suggests that, when it comes to those in power, we often feel that “an answer that is possibly (or even probably) wrong is better than none at all.” Translation: we are more alarmed by leaders who waver than by those who screw up.

As the late renowned military historian Barbara Tuchman observed, “to recognize error, to cut losses, to alter course, is the most repugnant option in government.” This is Hamlet all over again: we notice the uncertainty, hesitations, and reversals without noticing (or caring) what inspired them. No matter how merited doubt and admissions of error might be, we loathe them in our political leaders, and associate them—inaccurately, but indissolubly—with weakness.

Leaders (especially in politics) have no room to say they don’t know or that they have changed their mind; they will be eviscerated for ignorance or flip-flopping. Both options draw instant ridicule and condemnation. But that’s a terrible expectation to set, because no one can have a thoughtful position on every possible topic, and anyone who keeps learning will change their mind as they grow. This will likely never be solved in politics, but if you’re in business, create a culture that recognizes the importance of saying “I don’t know” and of changing your mind.

9. Have conviction while still being open to being wrong

From Being Wrong:

The psychologist Rollo May once wrote about the “seeming contradiction that we must be fully committed, but we must also be aware at the same time that we might possibly be wrong.” Note that this is not an argument for centrism, or for abandoning the courage of our convictions. May’s point was precisely that we can retain our convictions—and our conviction—while jettisoning the barricade of certainty that surrounds them. Our commitment to an idea, he concluded, “is healthiest when it is not without doubt, but in spite of doubt.”

There’s a tendency to prize conviction and confidence at the expense of accuracy. You can have strong beliefs; just don’t let them blind you to conflicting evidence or to the irreducible uncertainty they carry. It’s not an either/or situation. You can have confidence and still be open to being wrong.

10. Use forcing functions – these let us know if we are making a mistake and guide corrective action

From Being Wrong:

These red flags in our environment are, in essence, a kind of forcing function—the engineer’s term of art for features of the physical world that alert us to the fact that we are making a mistake.

A forcing function is a mechanism that shapes behavior when we are about to make a mistake. The more we build automatic tools to guide decisions, the better we can catch mistakes and avoid the cognitive overload of trying to remember in our heads what to do. A minimal sketch of what this can look like in software follows below.
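As a concrete illustration, here is a minimal sketch of a forcing function in code. The scenario and names (drop_database, ConfirmationError) are hypothetical, not from the book: a destructive operation refuses to run until the caller explicitly confirms intent, so the mistake is interrupted at the moment it would otherwise happen.

```python
from typing import Optional

class ConfirmationError(Exception):
    """Raised when a destructive action has not been explicitly confirmed."""

def drop_database(name: str, confirm_name: Optional[str] = None) -> None:
    # The forcing function: execution stops unless the caller proves intent
    # by repeating the database name exactly -- analogous to a machine guard
    # or a "type the name to delete" confirmation dialog.
    if confirm_name != name:
        raise ConfirmationError(
            f"Refusing to drop {name!r}: pass confirm_name={name!r} to proceed."
        )
    print(f"Dropping database {name!r}...")  # the destructive work would go here

# Usage: the careless call fails loudly; only the deliberate call succeeds.
try:
    drop_database("prod")  # mistake caught by the mechanism, not by memory
except ConfirmationError as err:
    print(err)

drop_database("prod", confirm_name="prod")  # explicit, intentional
```

The design point is that the safe path is the default and the dangerous path requires deliberate extra effort: the error is caught by the mechanism itself rather than by anyone’s vigilance.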