Managing Human Error in the Investment Process

All investors make mistakes. Mistakes happen not only because of misjudgment but also because of the nature of investing itself. They arise from universal conditions within the investment world.1 In other words, the source of most mistakes isn’t the person; it’s the environmental and situational factors. It’s the system.

Rarely do we acknowledge and understand the system. We neglect environmental factors and reflexively attribute mistakes to personal factors: laziness, inattentiveness, ignorance, etc.

Investors operate in a complex world with imperfect information and an unpredictable future. Add pressure from clients and organizations, and investors are primed to err.

How leaders handle human error separates the great investment teams from the average. Better assessment and understanding of errors build a competitive advantage.

Yes, investors make mistakes. But those mistakes aren’t made in isolation; system issues exacerbate personal ones. The big idea is that resolving “system” issues will lessen the effect of unavoidable personal shortcomings.

It’s less about telling teams to:

·         work harder

·         pay more attention to detail

·         make fewer mistakes

·         follow additional processes

·         be more efficient

It’s more about:

·         removing conflicting organizational goals and expectations

·         establishing honest communication

·         allowing teams to be adaptable, not rigid

·         acknowledging the unavoidable nature of human error

·         removing strict silos in the organization

How do we do this?

James Reason has studied how organizations deal with human error. He’s worked with military and civilian aviation groups and medical organizations to develop better methods of managing human fallibility in high-risk environments. He’s published several books on the management of human error.

Reason describes two schools of thought on assessing human error: the popular “person”-focused approach and the less common “system”-focused approach.

Person Approach

The longstanding and widespread tradition of the person approach focuses on the unsafe acts - errors and procedural violations…It views these unsafe acts as arising primarily from aberrant mental processes such as forgetfulness, inattention, poor motivation, carelessness, negligence, and recklessness.2

We are all familiar with the person approach. When something goes wrong, leaders search for the person responsible and recommend additional training and/or punishment. While this is the most common approach, it’s typically superficial, simplistic, and fails to solve the problem.

System Approach

As Reason explains, the system approach is more useful:

The basic premise in the system approach is that humans are fallible and errors are to be expected, even in the best organisations [Note: the quotes contain several British spellings]. Errors are seen as consequences rather than causes, having their origins not so much in the perversity of human nature as in "upstream" systemic factors. These include recurrent error traps in the workplace and the organisational processes that give rise to them. Countermeasures are based on the assumption that though we cannot change the human condition, we can change the conditions under which humans work.3

The system approach makes two major adjustments to how we mitigate error.

First, it accepts human fallibility as an unavoidable aspect of any process. The goal is to work with the mistakes, not eradicate them.

Second, the system approach recognizes the power of environmental and situational factors imposed on teams.

To sum up the two approaches:

The person approach focuses on the errors of individuals, blaming them for forgetfulness, inattention, or moral weakness.

The system approach concentrates on the conditions under which individuals work and tries to build defences to avert errors or mitigate their effects.4

Every organization must fight the following urge:

Blaming individuals is emotionally more satisfying than targeting institutions.5

What makes investment organizations susceptible to these mistakes? What should be done about them?

The following seven principles will help organizations resolve underlying system problems, thereby improving team performance and morale.

1.       Adaptability, not Efficiency

Many investment organizations are run like an early 20th-century industrial firm. Teams are treated like factory workers – maximize every last bit of efficiency by creating rigid processes and minimizing downtime.

In his book, Team of Teams: New Rules of Engagement for a Complex World, General Stanley McChrystal describes how we can no longer optimize as we did in the past:

The models of organizational success that dominated the twentieth century have their roots in the industrial revolution and, simply put, the world has changed. The pursuit of “efficiency”— getting the most with the least investment of energy, time, or money — was once a laudable goal, but being effective in today’s world is less a question of optimizing for a known (and relatively stable) set of variables than responsiveness to a constantly shifting environment. Adaptability, not efficiency, must become our central competency.6

Humans are not machines. Performance, not efficiency, is the goal. I’ve seen managers get worried when people leave their desks to talk with a colleague, because it means they’re not reading a 10-K or listening to an earnings call. They completely disregard the value of unstructured idea-sharing. It’s this sharing of ideas that uncovers exceptional opportunities or identifies unseen risks. Managers need to worry less about measurable output, like the number of companies analyzed, and encourage the spontaneous, free flow of information across traditional investment silos.

Adaptability is the key, not blind adherence to procedure. Urban Meyer, former football coach at Ohio State and Florida, describes the necessity of adaptation:

The ability to be flexible and responsive in today’s competitive environment is a mandatory skill. The best athletes and teams are exceptional at adjusting and adapting to changing circumstances. It is foolish to resent or resist change. A rapidly changing world deals ruthlessly with people who fail to adapt. If you don’t like change, you are going to like irrelevance even less.7

2.       Failure as Normal

As Reason explains, high-reliability organizations directly confront failure:

High reliability organisations - which have less than their fair share of accidents - recognise that human variability is a force to harness in averting errors, but they work hard to focus that variability and are constantly preoccupied with the possibility of failure…They expect to make errors and train their workforce to recognise and recover them…Instead of isolating failures, they generalize them. Instead of making local repairs, they look for system reforms.8

Some organizations don’t talk about failure in the hopes that ignoring it will make it go away. Failure can’t be fixed if it isn’t revealed.

As Ray Dalio, founder of Bridgewater, states in his book Principles:

Don’t hide your mistakes and pretend you’re perfect. Find your imperfections and deal with them. You will either learn valuable lessons from your mistakes and press on or you won’t and will fail.9

Dalio recommends developing an organizational “reflexivity” towards mistakes.

…develop a reflexive reaction to pain that causes you to reflect on it rather than avoid it – it will lead to rapid learning/evolving.10

3.       Reporting Culture

Reporting mistakes and errors should be rewarded, not penalized. Reason explains:

Effective risk management depends crucially on establishing a reporting culture. Without a detailed analysis of mishaps, incidents, near misses, and "free lessons," we have no way of uncovering recurrent error traps or of knowing where the "edge" is until we fall over it. The complete absence of such a reporting culture within the Soviet Union contributed crucially to the Chernobyl disaster.11

Strict, top-down management discourages discussing problems. We need to stop managing investment organizations with Soviet-era strategies. Many organizations talk about open communication but don’t practice it. They’re always open to positive feedback, but not to negative feedback.

McChrystal elaborates on the fear of revealing imperfection. Organizations would rather pretend imperfection doesn’t exist than face reality:

In early 2003, when I served as the vice director for operations on the Pentagon’s Joint Staff, the United States Central Command (CENTCOM) initially prohibited the Pentagon staffs from viewing their internal Web site out of a common fear of giving “higher headquarters” visibility into unfinalized planning products. Such absurdities reflect the truth that most organizations are more concerned with how best to control information than how best to share it.12

The most valuable people in any investment organization are those on the front lines – the analysts and portfolio managers dealing with what’s happening in real time. It’s impossible for one CIO to track and manage every development across every asset class. It’s just too much. Investment teams must be comfortable sharing errors both up and down the chain of command. Penalizing error reporting prevents any possibility of honest feedback.

4.       Understand Latent Conditions

Latent conditions are the unspoken rules, norms, and expectations in any organization. They’re not formalized rules. These conditions lie dormant until circumstances turn them into a problem. Latent conditions evolve over time and exert excess pressure on your team.

Reason explains:

Latent conditions are the inevitable "resident pathogens" within the system. They arise from decisions made by designers, builders, procedure writers, and top-level management. Such decisions may be mistaken, but they need not be. All such strategic decisions have the potential for introducing pathogens into the system.13

Latent conditions are the unintended consequences arising from complex organizations. They can’t always be predicted, but they can be remedied.

Reason explains:

Latent conditions…can translate into error provoking conditions within the local workplace (for example, time pressure, understaffing, inadequate equipment, fatigue, and inexperience) and they can create longlasting holes or weaknesses in the defences (untrustworthy alarms and indicators, unworkable procedures, design and construction deficiencies, etc.) Latent conditions-as the term suggests-may lie dormant within the system for many years before they combine with active failures and local triggers to create an accident opportunity.14

One example from investing is the conflicting desire for high returns with low risk. In insurance, it’s trying to get higher yields with the same or less credit risk. In the pension world, it’s trying to hit the 7 or 8% assumed return with minimal downside risk. Investors know it’s not possible to have higher returns with lower risk, but that doesn’t stop leaders or clients from demanding it. The investment team is left chasing an impossible goal. Something must give, and the team knows it. It puts the team in a helpless position, guaranteed to disappoint.

5.       Specialist Capability + Generalized Knowledge

One way to better manage the “system” is by merging specialist capability with generalized knowledge. That is, investors specialize in their respective areas but remain engaged with other areas of the organization.

There is an increasing need to rapidly share ideas across asset classes and strategies. It no longer works to keep information isolated. The only way to protect the organization is to share ideas across all teams.

Special operations units, like the Navy SEALs, need to have both generalized and specialized capabilities. McChrystal explains:

Team members tackling complex environments must all grasp the team’s situation and overarching purpose. Only if each of them understands the goal of a mission and the strategic context in which it fits can the team members evaluate risks on the fly and know how to behave in relation to their teammates. Individual SEALs have to monitor the entirety of their operation just as soccer players have to keep track of the entire field, not just their own patch of grass. They must be collectively responsible for the team’s success and understand everything that responsibility entails.

We did not want all the teams to become generalists — SEALs are better at what they do than intel analysts would be and vice versa. Diverse specialized abilities are essential. We wanted to fuse generalized awareness with specialized expertise.15

Here’s the takeaway for investment firms. Yes, let people specialize in their areas. But create time to share ideas. It doesn’t have to be lengthy, just consistent. It’s the cohesive understanding that spots opportunities and risks. In past organizations, I’ve found that many teams have no idea what the other teams are doing, where they see risk, or where they are finding opportunity.

6.       Value Purpose over Procedure

Blindly following established procedures with no appreciation of their underlying purpose creates systemic issues. It’s a consequence of rewarding rule-following and covering your ass rather than thinking deeply about the mission.

The aviation industry found this out the hard way. On December 28, 1978, United Airlines Flight 173 crashed near Portland, Oregon, after running out of fuel. The reliance on strict procedures pushed the captain to follow rules rather than fly the plane.

McChrystal explains:

In the case of Flight 173, the time spent retrieving flashlights, putting on jackets, zipping books into bags, and reassuring passengers was a deadly waste. Of course, no crew member would have knowingly risked lives just to keep books from spilling across the cockpit, but they were so determined to follow procedure that they lost track of what mattered. They were doing things right, just not doing the right thing…The crew’s attachment to procedure instead of purpose offers a clear example of the dangers of prizing efficiency over adaptability.16

Make sure your team understands the underlying purpose of any procedure. The goal isn’t to check the box. It’s to better understand reality.

7.       The Dinosaur’s Tail: The Leader as a Bottleneck

Sometimes the best thing a leader can do is get out of the way. McChrystal’s task force developed “empowered execution” to increase the speed and quality of decision making.

McChrystal explains:

Within such complexity, leaders themselves can be a limiting factor. While the human capacity for thought and action is astounding, it is never quite enough. If we simply worked more and tried harder, we reason, we could master the onslaught of information and “urgent” requirements. But of course we can’t...I would tell my staff about the “dinosaur’s tail”: As a leader grows more senior, his bulk and tail become huge, but like the brontosaurus, his brain remains modestly small. When plans are changed and the huge beast turns, its tail often thoughtlessly knocks over people and things. That the destruction was unintentional doesn’t make it any better.17

As leaders become inundated with meetings, emails, requests, and speaking engagements, their tail grows. It’s frustrating when a leader doesn’t get around to approving an actionable idea. It kills the entrepreneurial and ambitious drive of the team. What is often seen as “laziness” and “going through the motions” is people giving up on trying to move the organizational needle.

 

Sources:

1. James Reason. Patient Safety, Human Error, and Swiss Cheese

2. James Reason. Patient Safety, Human Error, and Swiss Cheese

3. James Reason. Patient Safety, Human Error, and Swiss Cheese

4. James Reason. Patient Safety, Human Error, and Swiss Cheese

5. James Reason. Patient Safety, Human Error, and Swiss Cheese

6. Stanley McChrystal. Team of Teams: New Rules of Engagement for a Complex World

7. Urban Meyer. Above the Line: Lessons in Leadership and Life

8. James Reason. Patient Safety, Human Error, and Swiss Cheese

9. Ray Dalio. Principles: Life and Work

10. Ray Dalio. Principles: Life and Work

11. James Reason. Patient Safety, Human Error, and Swiss Cheese

12. Stanley McChrystal. Team of Teams: New Rules of Engagement for a Complex World

13. James Reason. Patient Safety, Human Error, and Swiss Cheese

14. James Reason. Patient Safety, Human Error, and Swiss Cheese

15. Stanley McChrystal. Team of Teams: New Rules of Engagement for a Complex World

16. Stanley McChrystal. Team of Teams: New Rules of Engagement for a Complex World

17. Stanley McChrystal. Team of Teams: New Rules of Engagement for a Complex World