“The greatest obstacle to discovery is not ignorance, it is the illusion of knowledge.” -Daniel J. Boorstin, quoted in The Art of Thinking Clearly by Rolf Dobelli
We all make hundreds of decisions every day. They range from the predictable (what to eat for breakfast) to the complex (making a significant investment in the market). For the straightforward, repeatable tasks, we’ve developed routines and habits to automate those decisions. This automation serves us well. It frees up our mental capacity to shift from debating non-essential decisions to focusing on major, life-changing ones.
When we face any high-value decision, most of us follow a predictable pattern. We figure out the desired outcome, gather as much information as possible, and weigh all the information as logically as we can to arrive at the right choice.
Of course, there are problems that occur along the way. First, we may not define our goals well enough to know what we want. This leads to good decisions that solve the wrong problem. We come to a logical conclusion, but the conclusion doesn’t deliver a meaningful outcome because we didn’t specify our goal in advance.
Or the problem can occur at decision time. We have all the evidence we need, but we weigh it incorrectly. Perhaps we overweight a salient piece of evidence that actually has little predictive value. For example, when deciding to travel, some people avoid flying because they recall graphic, high profile plane crashes. The evidence is clear - air travel is extremely safe. However, we tend to ignore the uneventful, yet valid evidence and keep the few vivid crashes overrepresented in our minds. Errors like these distort how we balance and interpret the evidence we have.
The one problem I believe is the most counterintuitive is how we gather our information. How do we screw up this step? Don’t we just gather up the information we need and move on? Isn’t more information better? How tough can it be to sift through all the evidence and separate the useless from the valuable? It’s never a simple data collection process - there are quite a few errors we make when deciding which information to consume.
The goal is not always to find more information; it’s to make better use of the information we have.
Principle #1: Focus on disconfirming evidence
There is a natural tendency to seek out and agree with evidence that already supports our views. Our minds instinctively gravitate toward information that confirms what we believe. It’s nice to feel that reassurance. It’s uncomfortable to think about how we might be wrong. While facing our flaws isn’t easy, ignoring them is disastrous.
Kathryn Schulz, author of Being Wrong, explains why we need to keep our information and views as testable hypotheses, ready to be overturned by falsification and disconfirming evidence.
From Being Wrong:
As an ideal of intellectual inquiry and a strategy for the advancement of knowledge, the scientific method is essentially a monument to the utility of error. Most of us gravitate toward trying to verify our beliefs, to the extent that we bother investigating their validity at all. But scientists gravitate toward falsification; as a community if not as individuals, they seek to disprove their beliefs. Thus, the defining feature of a hypothesis is that it has the potential to be proven wrong (which is why it must be both testable and tested), and the defining feature of a theory is that it hasn’t been proven wrong yet. But the important part is that it can be—no matter how much evidence appears to confirm it, no matter how many experts endorse it, no matter how much popular support it enjoys.
When gathering information, focus on information that disagrees with your initial assumption. In investing, if you are bullish on a stock, find analysts and investors who are short the stock. Use their perspective to challenge your beliefs and identify flaws in your reasoning. It’s easy to become defensive. You must overcome this tendency.
The goal is to make it hard to fall in love with your own ideas. If we were perfect, we wouldn’t need this step. Our flaws in decision making are well documented by research. The more you challenge your cherished beliefs, the better decision maker you will become.
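One way to make this concrete is to treat a belief as a probability and update it when disconfirming evidence arrives. The sketch below uses Bayes’ rule with hypothetical numbers (the 80% prior and the likelihoods are illustrations, not figures from any source) to show how a credible bearish argument should pull confidence down rather than be waved away.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a hypothesis after seeing evidence."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Hypothetical numbers: we start 80% confident our bullish stock thesis is right.
prior = 0.80

# A short seller's argument is the kind of evidence we'd expect to see far more
# often if the thesis were wrong (70%) than if it were right (20%).
posterior = bayes_update(prior, p_evidence_if_true=0.20, p_evidence_if_false=0.70)
print(round(posterior, 2))  # → 0.53
```

The point is not the specific numbers but the discipline: if encountering the opposing case never lowers your confidence, you are not treating your belief as a testable hypothesis.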
Principle #2: Your ego is a bigger concern than more information
Sometimes the goal shouldn’t be making smarter decisions, but eliminating our dumb mistakes. It’s a different perspective: we tend to think about adding more intelligence, when instead we should be removing avoidable errors. Our ego convinces us that we never make dumb decisions. Yet we still do dumb things, especially when making decisions.
The biggest challenge to removing our faults is our ego. As mentioned before, we don’t like to deliberate about our shortcomings and mistakes. Most people bury their heads in the sand and hope the mistakes go away. Unfortunately, not only does the mistake not go away, it grows and compounds until it becomes catastrophic. The story below, from Being Wrong, illustrates the strength of our ego in defending our bad decisions.
In 1977, the psychologists Richard Nisbett and Timothy Wilson set up shop in a department store in Michigan, where they asked people to compare what they claimed were four different varieties of pantyhose. In reality, all the hose were the same, but that didn’t prevent shoppers from showing a preference for one of them. Moreover, it didn’t stop them from explaining their preference, by claiming that (for instance) this color was just a little more appealing or that fabric was a little less scratchy…
It’s weird enough that these shoppers chose between identical pantyhose in the first place, but it is even weirder that they generated explanations for those choices. After all, they could have just shrugged and declined to explain their decisions. We are expected to be able to justify our beliefs, but not so our taste. “I just like that one; I couldn’t tell you why” is a perfectly acceptable explanation for why we’re attracted to a particular pair of pantyhose. Since there were no differences among the pantyhose, these accounts couldn’t have been the real reasons behind the shoppers’ choices; they could only be post-hoc justifications.
At first, this experiment seems to demonstrate a strange but basically benign quirk of human cognition: we like to explain things, even when the real explanation eludes us. But it has a sobering epilogue. When Nisbett and Wilson revealed the nature of the experiment to its unwitting subjects, many of them refused to believe that the pantyhose were identical. They argued that they could detect differences, and they stuck by their original preferences.
The problem is never the errors themselves; it’s the after-the-fact rationalizations we create to justify and cover up our mistakes. Imagine how much we could improve if we acknowledged our incompetence and used feedback to refine our beliefs. We need to defeat our sense of entitlement. While we have the right to believe anything we want, that doesn’t mean we should. It’s a poor way to live our lives.
“Many people seem to feel that their opinions are somehow sacred, so that everyone else is obliged to handle them with great care. When confronted with counterarguments, they do not pause and wonder if they might be wrong after all. They take offense.” -Jamie Whyte, author of Crimes Against Logic
Principle #3: Don’t neglect the outside view
Another information flaw is neglecting the outside view. When we make a decision, we need to distinguish between the inside and the outside view. Respected investor Michael Mauboussin has written extensively on the use of the inside/outside views. The excerpts below are from Mauboussin’s report, The Base Rate Book: Integrating the Past to Better Anticipate the Future.
The Inside View
As Mauboussin states,
“There is a natural and intuitive approach to creating a forecast of any kind. We focus on an issue, gather information, search for evidence based on our experience, and extrapolate with some adjustment. Psychologists call this approach the ‘inside view.’”
The Outside View
In contrast, Mauboussin states,
“The 'outside view' considers a specific forecast in the context of a larger reference class. Rather than emphasizing differences, as the inside view does, the outside view relies on similarity. The outside view asks, ‘What happened when others were in this situation?’ This approach is also called ‘reference class forecasting.’”
Why is it necessary to separate the inside and outside views? There are two general ways to approach a decision. First, we can think about our problem using the current information we have. If we are thinking about the likelihood of success for an investment, we can think about today’s information – the fundamentals of the company, the current economic environment, etc. This is the inside view.
The outside view uses information from a larger class, that is, the investment history of comparable investments. The outside view shows how comparable investments, made under similar conditions, performed in the past. Are there common historical attributes that can inform the current decision? We might ask, how have past investments done at a similar valuation level? Using historical information gives us a base rate to judge the current opportunity. If these investments have typically struggled, we should think harder about whether the current opportunity is likely to succeed.
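The base-rate logic above can be sketched numerically. Everything below is hypothetical: the reference-class outcomes, the 70% inside-view estimate, and the fixed blending weight are illustrations of the idea, not a method prescribed in Mauboussin’s report.

```python
# Hypothetical reference class: past investments made at similarly high
# valuations, recorded as 1 (met the return target) or 0 (did not).
reference_class = [0, 1, 0, 0, 1, 0, 0, 0, 1, 0]

base_rate = sum(reference_class) / len(reference_class)  # the outside view

inside_view = 0.70  # our enthusiastic, case-specific estimate of success

# Blend the two views. The weight on the inside view should reflect how much
# genuine, case-specific predictive information we believe we actually have.
weight_inside = 0.3
estimate = weight_inside * inside_view + (1 - weight_inside) * base_rate

print(f"base rate: {base_rate:.0%}, blended estimate: {estimate:.0%}")
# → base rate: 30%, blended estimate: 42%
```

The blend pulls the rosy 70% inside-view figure back toward what comparable situations have actually produced, which is exactly the correction the outside view is meant to supply.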
Those who rely solely on the inside view often overweight anecdotal evidence that provides little value in predicting success. We’re more tempted to fall for seductive narratives and stories than to rely on a larger body of past evidence. The inside view gives a one-sided, biased picture because it rests on a small sample – a sample size of one. The outside view provides a larger sample and more predictive power than you can derive from looking at the current investment alone.
We run into problems when we overweight the specifics of a current situation and neglect the outside view. Current information is always much more salient and persuasive – it’s fresh in our minds and seems more relevant. But the outside view is more powerful because it brings in thousands, if not millions, of past data points with far greater predictive success.
In a perfect world, how would we go about evaluating all this evidence? As it turns out, we have fairly strong and uniform opinions about this. By rough consensus, the ideal thinker approaches a subject with a neutral mind, gathers as much evidence as possible, assesses it coolly, and draws conclusions accordingly. - From Being Wrong
The goal is not to know everything. The goal is to know where your knowledge ends, because then you can compensate and adjust for the things you don’t know. But if you don’t know you don’t know, you repeatedly fall into the same traps. When you repeat the same mistakes, you can’t grow and improve your life.