
The danger of cognitive biases in the aviation industry

22nd Feb 2019
Andrew Gilbey, Senior Lecturer, Massey University
[Image: plane in the air. Credit: guvendemir/iStock]

At Academics' Corner we feature the best HR researchers that tell you what they’ve found and what you need to do differently on the back of the research. If you’re an academic with a relevant story, please get in touch on [email protected].

We all make decisions every day. Get up now or grab an extra few minutes in bed? Toast or cornflakes? Dress up or dress down? Which candidate should we choose? Finish the report or leave it until tomorrow?

The reality of modern life is that many decisions must be made quickly and without access to all of the relevant information.

In situations like this, people often rely upon rules-of-thumb or mental shortcuts—what psychologists call heuristics—to guide their decision making. Even when time is not an issue and all of the relevant information is available, people often use heuristics as a means to reduce mental workload.

The dark side of heuristics

Decisions made using heuristics often provide quick and adequate solutions.

However, since the early 1970s, a considerable body of work by Nobel Prize winner Daniel Kahneman and his colleagues has demonstrated that the use of heuristics can lead to systematic errors (biases), whereby decision makers may deviate from what might reasonably be considered rational or good judgements.

In other words, whilst heuristics may provide adequate decisions, they can also lead to people making imperfect, expensive or dangerous choices.

From small-scale consequences...

Fortunately, the consequences of making the wrong choice will often be minor. Deciding against a raincoat because last night’s weather forecast was fine, even though some dark clouds were looming when you left home, may mean getting wet—but it’s hardly the end of the world.


The anchoring and adjustment bias occurs when people make a decision on the basis of initial evidence (their anchor), and then fail to sufficiently adjust their views in the face of more recent evidence to the contrary.

...to serious problems

Sometimes, the consequences of making the wrong choice will be expensive. Turning a blind eye to your sales manager’s sexist jokes, because so far everyone seems to have found them amusing, may backfire when the jokes offend a major international client about to sign the biggest deal your company has ever seen.

When decisions are based primarily on the outcome of the behaviour (no one complained and everyone had a giggle), rather than on its merits at the time (sexist jokes are an anachronism), the decision-maker has fallen prey to outcome bias. In hindsight, we no doubt realise that overlooking such behaviour was unwise as it was only a matter of time until serious offence was caused.

Ironically, believing, after an event has occurred, that it was more predictable than it actually was at the time has a name too: hindsight bias.

Homing in on the aviation industry

For several years now, my colleagues and I have been investigating the effect of cognitive biases on decisions made in the aviation industry. Like other high-risk industries, such as nuclear power, mining and oil production, poor decisions in aviation can be dangerous and potentially place many lives at risk.


One of the questions we have explored is how pilots of light aircraft decide whether it is safe to continue a flight into deteriorating weather conditions: should they turn back, divert to a different destination or continue to their intended destination?

We found that pilots are likely to continue a flight into deteriorating weather conditions based on an earlier favourable weather report, rather than consider new evidence to the contrary, such as a lack of visibility or storm clouds that they can see through the cockpit window ahead of them.

This is a much more serious example of how the anchoring and adjustment bias can affect a decision. It is widely accepted that flying into deteriorating weather is the most common antecedent of fatal accidents in small aircraft.

Why do people not report unsafe practices?

We were also interested in why people frequently choose not to report unsafe practices. Our studies have found that deciding to file a report about unsafe, unwise, or illegal behaviour is overwhelmingly influenced by the outcome of the behaviour.

Unsafe behaviours that turn out badly are much more likely to be reported than unsafe behaviours that turn out well, which is another example of the effect of the outcome bias mentioned earlier.

At face value, this probably seems right; an intoxicated driver who causes a fatal accident is much more likely to receive a custodial sentence than a similarly intoxicated driver who makes it home unscathed but is apprehended as they park in their garage.


However, logic dictates that two identical decisions or behaviours made with the same information and under the same conditions should be evaluated equivalently, regardless of the outcome; after all, the decisions to drive whilst intoxicated were made before the outcome was known.

The only difference between the two drivers is that one was luckier than the other.

When outcome bias leads to underreporting of safety concerns, valuable lessons that might improve safety are lost to the industry as a whole.

The problem of 'positive testing strategy'

Another common bias we have observed in pilots is that, when making a decision under time pressure, they often favour a ‘positive testing strategy’ which may lead to confirmation bias.

That is, when making a decision, pilots favour evidence consistent with their initial expectations, whilst ignoring or placing less weight on evidence to the contrary.

The problem with this strategy is that it is often possible to find evidence to support an expectation, even when it is wrong; however, just one piece of negative evidence is enough to prove an initial expectation is wrong.

This means that, when testing a belief (e.g. is it safe to continue a flight to my intended destination?) pilots should seek out negative evidence, on the basis that if it is found, it unequivocally demonstrates that their belief is wrong.


However, we found that pilots frequently decided to continue a flight into deteriorating weather conditions based on confirmatory evidence such as ‘another aircraft just flew the same route with no problems’, rather than choosing to turn back because of disconfirmatory evidence such as ‘the cloud is getting thicker’.

How can we avoid falling prey to cognitive biases such as the ones mentioned above?

In a 2011 article in the Harvard Business Review, entitled ‘Before you make that big decision’, Kahneman and colleagues provided a 12-question checklist to help executives vet their decisions. In principle, the checklist appears straightforward, easy to implement and based on evidence from research.

But how well does debiasing actually work in practice?

In several studies in which we have tried to reduce the effects of cognitive biases in student pilots, we have had little success. Even when we tried to reduce cognitive bias in a group of psychology students who had just completed a lecture on cognitive biases, we had no success!

Our suspicion is that cognitive biases are not just pervasive, but are hard-wired into our brains. Perhaps learning that many decisions may be biased is a useful starting point.

So, are my own decisions affected by cognitive biases? If I answered ‘of course not’, that would be a good example of the overconfidence bias.

Replies (2)


By Kate Wadia
01st Sep 2016 11:24

Thank you Andrew for terrifying and interesting reading! I have 2 take-away points from your article: firstly that awareness of "mental shortcut" must allow for a space before to check in on our professional self (or colleague) for the bias, a cognitive "stop and search". Secondly - having just noted another analogy from the aviation industry about project contingency planning and bias against thinking about plan B's (t.co/iSSvFVZVoW) - I conclude that on the most difficult questions of mindset, HR should perhaps ask "Now what would we all do if we were flying a plane?"!

By Andrew Gilbey
01st Sep 2016 22:41

Thanks for your thoughts, Kate - you're dead right on the first one and that second article is very interesting - I'm reading it right now. That question you pose is spot-on, as HR decisions may also be devastating when wrong.
