13 Ideas You Should Know From: Mistakes Were Made (But Not by Me) by Carol Tavris and Elliot Aronson

The 13 Big Ideas:

  • Contrary Evidence Won’t Always Change Minds; in fact, it may increase resistance

  • The difference between self-justification and lying

  • We all make mistakes; the difference is who will admit them and learn from them

  • The Backfire Effect – when trying to change opinions with evidence backfires

  • If you are looking for advice, don’t ask those who just made the choice

  • Groups and teams need to have feedback, criticism, and pushback

  • Memories aren’t like retrieving a picture – we add/subtract details to make ourselves look better

  • Beware of people who add fancy language to seem more persuasive

  • Need to ask – how do you know you are right?

  • Vivid experiences are not forgotten and shape future behavior

  • Probability, Not Certainty

  • When people make a mistake – don’t lecture; instead, try to understand

  • Embrace the role of mistakes; our culture misunderstands the role of mistakes


My Highlights From the Book:

Contrary Evidence Won’t Always Change Minds; in fact, it may increase resistance

Most people, when directly confronted by evidence that they are wrong, do not change their point of view or plan of action but justify it even more tenaciously. Politicians, of course, offer the most visible and, often, most tragic examples of this practice.

When politicians’ backs are against the wall, they may reluctantly acknowledge error but not their responsibility for it. The phrase “mistakes were made” is such a glaring effort to absolve oneself of culpability that it has become a national joke—what the political journalist Bill Schneider called the “past exonerative” tense. “Oh, all right, mistakes were made, but not by me, by someone else, someone who shall remain nameless.”

The difference between self-justification and lying

Self-justification is not the same thing as lying or making excuses. Obviously, people will lie or invent fanciful stories to duck the fury of a lover, parent, or employer; to keep from being sued or sent to prison; to avoid losing face; to avoid losing a job; to stay in power. But there is a big difference between a guilty man telling the public something he knows is untrue (“I did not have sexual relations with that woman”; “I am not a crook”) and that man persuading himself that he did a good thing. In the former situation, he is lying and knows he is lying to save his own skin. In the latter, he is lying to himself. That is why self-justification is more powerful and more dangerous than the explicit lie. It allows people to convince themselves that what they did was the best thing they could have done.

Yet mindless self-justification, like quicksand, can draw us deeper into disaster. It blocks our ability to even see our errors, let alone correct them. It distorts reality, keeping us from getting all the information we need and assessing issues clearly. It prolongs and widens rifts between lovers, friends, and nations. It keeps us from letting go of unhealthy habits. It permits the guilty to avoid taking responsibility for their deeds. And it keeps many professionals from changing outdated attitudes and procedures that can harm the public.

We all make mistakes; the difference is who will admit them and learn from them

To err is human, but humans then have a choice between covering up and fessing up. The choice we make is crucial to what we do next.

The Backfire Effect – when trying to change opinions with evidence backfires

People who receive disconfirming or otherwise unwelcome information often do not simply resist it; they may come to support their original (wrong) opinion even more strongly—a backfire effect. Once we are invested in a belief and have justified its wisdom, changing our minds is literally hard work. It’s much easier to slot that new evidence into an existing framework and do the mental justification to keep it there than it is to change the framework.

If you are looking for advice, don’t ask those who just made the choice

You can see one immediate benefit of understanding how dissonance works: Don’t listen to Nick. The more costly a decision in terms of time, money, effort, or inconvenience, and the more irrevocable its consequences, the greater the dissonance and the greater the need to reduce it by overemphasizing the good things about the choice made. Therefore, when you are about to make a big purchase or an important decision—which car or computer to buy, whether to undergo plastic surgery, or whether to sign up for a costly self-help program—don’t ask someone who has just done it… That person will be highly motivated to convince you that it is the right thing to do.

Groups and teams need to have feedback, criticism, and pushback

“In normal circumstances,” wrote Hitler’s henchman Albert Speer in his memoirs, “people who turn their backs on reality are soon set straight by the mockery and criticism of those around them, which makes them aware they have lost credibility. In the Third Reich there were no such correctives, especially for those who belonged to the upper stratum. On the contrary, every self-deception was multiplied as in a hall of distorting mirrors, becoming a repeatedly confirmed picture of a fantastical dream world which no longer bore any relationship to the grim outside world. In those mirrors I could see nothing but my own face reproduced many times over.”

Our greatest hope of self-correction lies in making sure we are not operating in a hall of mirrors, in which all we see are distorted reflections of our own desires and convictions.

We need a few trusted naysayers in our lives, critics who are willing to puncture our protective bubble of self-justifications and yank us back to reality if we veer too far off. This is especially important for people in positions of power.

Memories aren’t like retrieving a picture – we add/subtract details to make ourselves look better

But most of us, most of the time, are neither telling the whole truth nor intentionally deceiving. We aren’t lying; we are self-justifying. All of us, as we tell our stories, add details and omit inconvenient facts; we give the tale a small, self-enhancing spin. That spin goes over so well that the next time we add a slightly more dramatic embellishment; we justify that little white lie as making the story better and clearer. Eventually the way we remember the event may bring us a far distance from what actually happened.

Moreover, recovering a memory is not at all like retrieving a file or playing a tape; it is like watching a few unconnected frames of a film and then figuring out what the rest of the scene must have been like. We may reproduce poetry, jokes, and other kinds of information by rote, but when we remember complex information, we shape it to fit it into a story line.

Beware of people who add fancy language to seem more persuasive

In their paper “The Seductive Allure of Neuroscience Explanations,” Deena Weisberg and her colleagues demonstrated that if you give one group of laypeople a straightforward explanation of some behavior and another group the same explanation but with vague references to the brain thrown in (“brain scans indicate” or “the frontal-lobe brain circuitry known to be involved”), people assume the latter is more scientific—and therefore more real. Many intelligent people—including psychotherapists—fall prey to the seductive appeal of this language, but laypeople aren’t called upon in court to try to explain what it means.

Need to ask – how do you know you are right?

Clinical intuition—“I know it when I see it”—is the end of the conversation to many psychiatrists and psychotherapists, but the start of the conversation to the scientist—“A good observation, but what exactly have you seen, and how do you know you are right?” Observation and intuition without independent verification are unreliable guides; like roguish locals misdirecting the tourists, they occasionally send everyone off in the wrong direction.

Vivid experiences are not forgotten and shape future behavior

Thus, people do not repress the memory of being tortured in prison, being in combat, or being the victim of a natural disaster (unless they suffered brain damage at the time), although details of even these horrible experiences are subject to distortion over the years, as are all memories. “Truly traumatic events—terrifying, life-threatening experiences—are never forgotten, let alone if they are repeated,” says McNally. “The basic principle is: if the abuse was traumatic at the time it occurred, it is unlikely to be forgotten.”

Probability, Not Certainty

Yet that kind of certainty is the hallmark of pseudoscience. True scientists speak in the careful language of probability—“Innocent people most certainly can be induced to confess, under particular conditions; let me explain why I think this individual’s confession is likely to have been coerced”—which is why scientists’ testimony is often exasperating.

Yet training that promotes the certainties of pseudoscience, and fails to instill a humbling appreciation of our cognitive biases and self-delusions, increases the chances of wrongful convictions in two ways. First, it encourages law enforcement officials to jump to conclusions too quickly. A police officer decides that a suspect is the guilty party and then closes the door to other possibilities. A district attorney decides impulsively to prosecute a case, especially a sensational one, without having all the evidence; she announces her decision to the media and then finds it difficult to back down when subsequent evidence proves shaky. Second, once a case is prosecuted and a conviction won, officials will be motivated to reject any subsequent evidence of the defendant’s innocence.

Imagine, for a moment, how you would feel if your partner, your grown child, or your parent said: “I want to take responsibility for that mistake I made; we have been quarreling about it all this time, and now I realize that you were right, and I was wrong.” Or if your employer started a meeting by saying, “I want to hear every possible objection to this proposal before we go ahead with it—every mistake we might be making.”

When people make a mistake – don’t lecture; instead, try to understand

“Ironically, this natural tendency to lecture may be one of the worst things a family member or friend can do,” Pratkanis says. “A lecture just makes the victim feel more defensive and pushes him or her further into the clutches of the fraud criminal.” Anyone who understands dissonance knows why. Shouting “What were you thinking?” will backfire because it means “Boy, are you stupid.” Such accusations cause already embarrassed victims to withdraw further into themselves and clam up, refusing to tell anyone what they are doing.

Therefore, says Pratkanis, before a victim of a scam will inch back from the precipice, he or she needs to feel respected and supported. Helpful friends and relatives can encourage the person to talk about how his or her values influenced what happened, and then listen uncritically to the answers. Instead of irritably asking “How could you possibly have listened to that creep?” you say, “Tell me what appealed to you about the guy that made you trust him.”

Embrace the role of mistakes; our culture misunderstands the role of mistakes

“Our culture exacts a great cost psychologically for making a mistake,” Stigler recalled, “whereas in Japan, it doesn’t seem to be that way. In Japan, mistakes, error, confusion [are] all just a natural part of the learning process.” (The boy eventually mastered the problem, to the cheers of his classmates.) The researchers also found that American parents, teachers, and children were far more likely than their Japanese and Chinese counterparts to believe that mathematical ability is innate; if you have it, you don’t have to work hard, and if you don’t have it, there’s no point in trying.