The Curse of Vague Thinking: How Empty Rhetoric and Hollow Words Mislead Investors


Vague thinking surrounds us. It’s permeated politics (“come together in a bipartisan way”), business management (“leverage synergies and create win-wins”), corporate mission statements (“build a corporate culture that respects and values the unique strengths…”), and of course, investing. 
Every day we are inundated with empty rhetoric used to hide, rather than reveal, the truth. It promotes laziness and obscures incompetence. It’s a tempting way to communicate. It’s the path of least resistance.

Vague communication is standard in the investment world. Market experts talk in vague generalities and obfuscating one-liners that do nothing to further investors’ understanding of the markets.

The investment world subjects us to an unrelenting assault of vague language. Vague verbiage has become a parasite on our quest to make wise investment decisions. Everyone pretends to know the future. No one admits ignorance. The illusion of knowledge is everywhere. It’s become absurd.

What do I mean by vague language?

• Empty rhetoric

• The art of saying something without saying anything substantive

• Hedged advice

• Generic, open-ended forecasts containing no practical value

Here are examples I pulled from recent Street reports with my interpretation underneath:

• “The phase has changed; we are now in a new cycle, which will have its own unique characteristics”

  • Of course, things have changed – any idiot can see that. But did you predict this in advance for your clients? What should investors do about it now?

  • What does a new cycle mean for portfolio positioning? Should we take more or less risk?

  • What specifically are these characteristics? Unique in what way?

• “However, there is the risk if not the likelihood of an uneven recovery, with significant setbacks along the way and some permanent damage”

  • Define “risk”. Is it a 10%, 50%, or 90% chance? There’s always a risk, so it’s nothing new. Exactly how “uneven” do you mean? And for how long?

  • What specific setbacks? Setbacks can always happen. Investors need to know which ones in advance; otherwise it’s no help.

  • Define “permanent damage”. To what degree? Will it really be forever, or some shorter timeline?

• “We might have seen the lows if—and it’s a big if—there is no resurgence of the virus as we open up the economy”

  • Of course we “might” have seen the lows. Anything “might” happen. We need to understand the probability and magnitude of that chance. Does “might” mean a 20% chance or 80% chance?

  • Obviously a resurgence in the virus would affect the market. But how likely is it to occur? Without a reasonable estimate, it’s an unactionable statement for investors.

At best, vague thinking conceals incompetence. At worst, it hides malice. Some don’t realize the softness of their language. Others know exactly what they’re doing and count on investors’ inattention.

We all fall for it, but not for lack of IQ. We just don’t pay attention. There is less and less rigor in our conversations. We need to stop being lazy and start being demanding.

We’re trying to root out ultracrepidarianism: the habit of giving opinions and advice on matters outside one’s knowledge.

Steven Levitt and Stephen Dubner, authors of Think Like a Freak, describe why the incentive to fake expertise is too strong for most people to ignore:

Every time we pretend to know something, we are doing the same: protecting our own reputation rather than promoting the collective good. None of us want to look stupid, or at least overmatched, by admitting we don’t know an answer. The incentives to fake it are simply too strong.1

Investors tend to be lazy and indiscriminate when interpreting information. We decide with soundbites, not evidence. We lean on stories, not verifiable data. We trust the illusion of confidence without verifying the substance. We favor authority and pedigrees over analysis and process.

Vague thinking is not an isolated phenomenon. Kathryn Schulz, author of Being Wrong: Adventures in the Margin of Error, describes how we favor confident yet incorrect answers over accurate answers expressed with uncertainty:

As social psychologists can tell you, both doubt and certainty are as contagious as the common cold: all else being equal, our own confidence increases around people who radiate assurance, and our own doubts flare up around the hesitant. It’s no surprise, then, that in politics (as in business, the military, and the sixth-grade student council), we typically anoint the ultraconfident to lead us. William Hirstein even suggests that, when it comes to those in power, we often feel that “an answer that is possibly (or even probably) wrong is better than none at all.” Translation: we are more alarmed by leaders who waver than by those who screw up.2

We fall for vague words that imply credibility but provide no substance. In Mistakes Were Made (but Not by Me), Carol Tavris and Elliot Aronson describe research by Deena Skolnick Weisberg and colleagues (“The Seductive Allure of Neuroscience Explanations”) showing how we associate fancy jargon with higher credibility:

…if you give one group of laypeople a straightforward explanation of some behavior and another group the same explanation but with vague references to the brain thrown in (“brain scans indicate” or “the frontal-lobe brain circuitry known to be involved”), people assume the latter is more scientific – and therefore real. Many intelligent people – including psychotherapists – fall prey to the seductive appeal of this language...3

Investment firms toss around buzzwords like artificial intelligence, machine learning, and multi-factor models in an attempt to convey expertise.

The one undeniable talent talking heads have is their skill at telling a compelling story with conviction.4

No one holds us accountable to think critically. It’s hard, so we slip into lazy habits.

Most people don’t have the time or inclination to think very hard about big problems like this. We tend to pay attention to what other people say and, if their views resonate with us, we slide our perception atop theirs.5

Two things need to happen:

First, exert more effort and diligence challenging the information we consume.

Second, demand more skin in the game from our advisors. Make a concerted and deliberate attempt to separate what people actually know vs. what they say they know.

This is not easy. It takes persistence.

There are four principles to combat vague thinking:

• Make Experts Quantify Predictions and Provide Uncertainty Estimates

• Force Experts to Go Deep

• Distinguish Talking About the Past vs. Predicting the Future

• Incorporate Accountability

Principle #1: Make Experts Quantify Predictions and Provide Uncertainty Estimates

Knowing what to measure, and how to measure it, can make a complicated world less so. There is nothing like the sheer power of numbers to scrub away layers of confusion and contradiction, especially with emotional, hot-button topics.6

Quantitative thinking cuts through the noise and vagueness. It’s not perfect, but it’s a substantial step in the right direction. Make experts convert vague, qualitative statements into something quantifiable. Measurement and judgment of claims are essential. Vague statements sidestep this scrutiny. Experts need to talk in numbers – probabilities, estimates, and confidence intervals.

Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.7

Quantifiable statements force ownership. They make it harder for experts to spin their statements in a positive light regardless of how the future turns out.

The first step is recognizing qualifying words: “could,” “should,” “likely,” “may,” “possibly,” and so on. These words allow unlimited leeway to say literally anything.

Anytime you see a qualifier, force experts to quantify.

Philip Tetlock calls qualifying words “hollow” words:

If we are serious about measuring and improving, this [using hollow words] won’t do. Forecasts must have clearly defined terms and timelines. They must use numbers.8

As Tetlock suggests, we also need firm timelines. A forecast without a timeline is worthless. Anything can be true if you give it enough time. Make experts commit.

Define uncertainty. Don’t neglect it. Every forecast has uncertainty. Forecasts only have value if investors can understand the level of uncertainty inherent in the forecast. A prediction that stocks will rise 10% next year isn’t valuable if we don’t know a) the confidence level and b) the range of outcomes surrounding the estimate.

In an uncertain environment, rigor is not found in precise, single point predictions, but rather in precisely defined uncertainty estimates…It is not obtained by selecting the one right vision for the future, but enables you to anticipate and prepare for multiple futures.9

By forcing experts to quantify, we unearth hidden assumptions and get a deeper look at the thought process, rather than just taking a number at face value.
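
To see what this looks like in practice, here is a minimal sketch in Python of a forecast once the hollow words are replaced with a testable claim, a probability, a range, and a deadline. The names and numbers are invented for illustration, not taken from any report:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Forecast:
    """A forecast with defined terms, numbers, and a timeline."""
    claim: str          # a precisely worded, testable statement
    probability: float  # stated chance the claim comes true (0 to 1)
    low: float          # bottom of the expected range of outcomes
    high: float         # top of the expected range of outcomes
    deadline: date      # the date the forecast can be judged

# Vague:  "Stocks should do well next year."
# Usable: the same idea with numbers and a timeline attached.
spx_2021 = Forecast(
    claim="S&P 500 total return for calendar year 2021",
    probability=0.70,   # 70% confidence the return lands in the range
    low=0.02,           # +2% at the low end
    high=0.18,          # +18% at the high end
    deadline=date(2021, 12, 31),
)
```

Written this way, the claim can actually be judged: by the deadline, the return either lands inside the stated range or it doesn’t, and the stated confidence can later be scored against reality (see Principle #4).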

Experts don’t like to talk about uncertainty because it makes them look ignorant and more vulnerable than they want to appear.

Principle #2: Force Experts to Go Deep

Both experts and investors prefer more information understood superficially to less information understood deeply. It’s tempting to think the answer is gathering more information when what we really need is to spend more time understanding the information we already have. Discuss deeply, not superficially.

It’s better to go deep and understand a handful of key points than to try to cover 100 pages of a quarterly economic report. Nobel laureate Herbert Simon says it best:

…A wealth of information creates a poverty of attention. The more items we attend to, the less time we can allot to each, and the less well we will perform any one of them.10

We need to spend more time uncovering how experts think and care much less about the forecast itself. It’s the process, not the outcome, that’s valuable. We’re trying to establish what’s repeatable and valuable about our experts. We do that by understanding their methods and logic.

Breaking down a forecast into its component parts enables better understanding. Philip Tetlock describes how famed physicist Enrico Fermi approached breaking down a question:

What Fermi understood is that by breaking down the question, we can better separate the knowable and the unknowable. So guessing—pulling a number out of the black box—isn’t eliminated. But we have brought our guessing process out into the light of day where we can inspect it. And the net result tends to be a more accurate estimate than whatever number happened to pop out of the black box when we first read the question. Of course, all this means we have to overcome our deep-rooted fear of looking dumb. Fermi-izing dares us to be wrong.11
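
To make Fermi-izing concrete, here is a hypothetical decomposition of one vague question, sketched in Python. Every figure is invented for illustration; the point is that each guess is now out in the open where it can be inspected and challenged on its own:

```python
# Vague question: "What will Company X earn per share next year?"
# Fermi-izing: break the one big guess into smaller, inspectable guesses.
# All figures below are invented for illustration.

revenue_this_year = 10_000_000_000  # $10B, known from filings
revenue_growth    = 0.05            # guess: roughly 5% growth
net_margin        = 0.12            # guess: roughly 12% net margin
shares_out        = 500_000_000     # known share count

revenue_next_year = revenue_this_year * (1 + revenue_growth)
net_income        = revenue_next_year * net_margin
eps_estimate      = net_income / shares_out

print(f"Estimated EPS: ${eps_estimate:.2f}")  # ~$2.52 with these guesses
```

The final number matters less than the structure: a skeptic can now attack the growth guess or the margin guess directly instead of arguing with a black-box answer.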

Experts love to talk quickly and gloss over the little details underlying their arguments. Stop the conversation and have them explain the building blocks of their thought process. Understand the process block by block. Discover the rigor behind their approach.

Real knowledge comes when people do the work…On the other hand, we have the people who don’t do the work — they pretend. While they’ve learned to put on a good show, they lack understanding. They can’t answer questions that don’t rely on memorization. They can’t explain things without using jargon or vague terms. They have no idea how things interact. They can’t predict consequences.12

Keep digging. Most experts explain by adding more vague details on top of the already vague prediction. They’re not used to people pushing back and pressing the fine details. Don’t let them off the hook. It’s your investments on the line.

Principle #3: Distinguish Talking About the Past vs. Predicting the Future

If you pay attention, you’ll realize most investment commentary is actually investment reporting – talking about things that have already happened. It’s easy to talk about the past because, obviously, it’s already happened. Hindsight makes experts of us all. Predicting the future though is much harder. Don’t confuse the two.

Beware of commentators who talk about past information as if they had always known it and had predicted it all along. All they’re doing is regurgitating what’s already happened and hoping we confuse basic reporting with the ability to provide sound advice.

There is value in reporting what’s happened, but don’t confuse it with predictive ability. In the moment, when experts talk about the past, it will seem like they knew it all along. They didn’t.

Explaining complicated events with hindsight sounds really impressive. But the real value is predicting these events ahead of time.

Clarify what time period people are talking about. Is it the past? Or is it the future? It’s funny how confident experts sound describing the past, but the minute the conversation shifts to the future, out come the mushy language and hollow words that hide the fact they have no idea what will happen.

Principle #4: Incorporate Accountability

The final principle is tracking experts over time. It’s the last step in validating who is worthwhile vs. who is worthless. Converting vague forecasts into quantifiable figures is a start. Now we need to track those predictions and finish the analysis of who adds value over time.

Accountability drives calibration. Calibration measures how often predictions actually agree with reality. If the alignment is tight, the process is sound. If not, there’s little substance behind it.

One of the most important tests of a forecast—I would argue that it is the single most important one—is called calibration. Out of all the times you said there was a 40 percent chance of rain, how often did rain actually occur? If, over the long run, it really did rain about 40 percent of the time, that means your forecasts were well calibrated. If it wound up raining just 20 percent of the time instead, or 60 percent of the time, they weren’t. Calibration is difficult to achieve in many fields. It requires you to think probabilistically, something that most of us (including most “expert” forecasters) are not very good at. It really tends to punish overconfidence—a trait that most forecasters have in spades.13
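
Once forecasts are logged, this check is mechanical to run. Here is a minimal sketch in Python using a toy forecast log (all data invented): bucket predictions by their stated probability and compare each bucket with the observed hit rate:

```python
from collections import defaultdict

def calibration_table(forecasts):
    """Bucket (probability, outcome) pairs by stated probability and
    print how often each bucket's predictions actually came true."""
    buckets = defaultdict(list)
    for prob, happened in forecasts:
        buckets[round(prob, 1)].append(happened)
    for prob in sorted(buckets):
        outcomes = buckets[prob]
        hit_rate = sum(outcomes) / len(outcomes)
        print(f"said {prob:.0%} -> happened {hit_rate:.0%} (n={len(outcomes)})")

# Toy log of (stated probability, did it happen?)
log = [(0.4, True), (0.4, False), (0.4, False), (0.4, True), (0.4, False),
       (0.8, True), (0.8, True), (0.8, False), (0.8, True), (0.8, True)]
calibration_table(log)
# said 40% -> happened 40% (n=5)
# said 80% -> happened 80% (n=5)
```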

Experts will never do this for you. Forecasters rarely track their predictions. Like most people, they tend to assume they’re above average and don’t need any feedback. We step in and force feedback.

There are various ways to drive accountability. You can examine investment track records over time; good processes show through in good performance. You can save the predictions experts publish and later compare them to what actually happened. Or you can keep notes on specific forecasts and track them yourself over time, as in the sketch below.
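
A forecast journal doesn’t need to be fancy. Here is a minimal sketch in Python (the file name, fields, and strategist are illustrative): append each prediction when it’s made, fill in the outcome once the deadline passes, and the resolved rows become the input to a calibration check like the one above:

```python
import csv
import os
from datetime import date

FIELDS = ["date_made", "source", "claim", "probability", "deadline", "outcome"]

def log_forecast(path, source, claim, probability, deadline):
    """Append a new, unresolved forecast to the journal."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(FIELDS)  # write the header on first use
        writer.writerow([date.today().isoformat(), source, claim,
                         probability, deadline, ""])  # outcome filled in later

# A hypothetical entry: a specific claim, a probability, and a deadline.
log_forecast("forecasts.csv", "Strategist A",
             "S&P 500 finishes the year above 3,500", 0.65, "2020-12-31")
```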

Final Message

The last idea is borrowed from The Happiness of Pursuit, by Chris Guillebeau. It sums up the message of this article:

Remember the words of Kathleen Taylor, who worked with hospice patients in their final days. Once you’re near the end, there’s no time for bullshit. But what if you decided there’s no time for bullshit…far in advance of the end? What if you vow to live life the way you want right now, regardless of what stage of life you’re in?14

Investors put up with way too much bullshit in the markets. The investment world is filled with articulate, confident professionals making compelling claims that fail under scrutiny. What seems persuasive on the surface is revealed as ignorant underneath. We need to know who’s doing the hard work and who is pretending. It takes deliberate effort and a skeptical attitude. No one will do this for us. As Ronald Reagan said, “Trust, but verify.” Investors do a lot of trusting, but not much verifying.

Sources:

1. Steven Levitt and Stephen Dubner, Think Like a Freak

2. Kathryn Schulz, Being Wrong: Adventures in the Margin of Error

3. Carol Tavris and Elliot Aronson, Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts

4. Philip Tetlock and Dan Gardner, Superforecasting: The Art and Science of Prediction

5. Steven Levitt and Stephen Dubner, Think Like a Freak

6. Steven Levitt and Stephen Dubner, Think Like a Freak

7. Maria Popova, “The Baloney Detection Kit: Carl Sagan’s Rules for Bullshit-Busting and Critical Thinking.” https://www.brainpickings.org/2014/01/03/baloney-detection-kit-carl-sagan/

8. Philip Tetlock and Dan Gardner, Superforecasting: The Art and Science of Prediction

9. J. Edward Russo and Paul Schoemaker, Winning Decisions: Getting It Right the First Time

10. Morten Hansen, Great at Work: The Hidden Habits of Top Performers

11. Philip Tetlock and Dan Gardner, Superforecasting: The Art and Science of Prediction

12. Ryan Reeves, “The Ultimate Learning Guide via Shane Parrish.” https://medium.com/@ryan_reeves/the-ultimate-learning-guide-via-shane-parrish-421dee960f27

13. Philip Tetlock and Dan Gardner, Superforecasting: The Art and Science of Prediction

14. Chris Guillebeau, The Happiness of Pursuit: Finding the Quest That Will Bring Purpose to Your Life