Justin Hughes writes on issues relating to team and organisational performance. A former Red Arrows pilot, he is now Managing Director of Mission Excellence, a consultancy focused on improving clients’ execution – their ability to close the gap between what gets talked about and planned, and what gets done. He previously spent 12 years as an RAF fighter pilot, is a renowned speaker on performance and risk, and has presented alongside Richard Branson and Kofi Annan. He can be found on Twitter at @justinmissionex.

Following last month’s post on humility, I thought I would follow up on a related theme and introduce the idea of a Red Team, whose role is to puncture the nasty little hidden assumptions in our planning and thinking.

When the CIA decided to launch a special forces mission against the suspected hideout of Osama bin Laden, what do you think was the biggest risk for the mission?

It is worth seriously considering this question before reading on. What do you think?

If you answered anything to do with helicopters, casualties or similar factors, those are all good answers, but in the final reckoning they are incorrect. These are operational risks: known, mitigated as far as possible, and confirmed as either acceptable or not.

How did the President answer this question?

In an interview for TV after the mission, President Obama offered a different perspective. The biggest concern at the top of the CIA was simply whether the figure living in the compound was bin Laden or not. 

Those agents who had come up with the hypothesis had been working on the case for some time. They desperately wanted to find bin Laden. They had huge emotional and professional investment in ‘being right’. They were about the least likely people to see any flaws in their own thinking.

The concept of a ‘Red Team’

So the Director of the CIA formed a ‘Red Team’, whose job was to come up with alternative hypotheses for who might be living in the compound. Its members had not worked on the case before. They examined the evidence with a fresh set of eyes and carried none of the inherent biases that come with being responsible for the thinking that led to the original conclusion.

The Red Team came up with a number of alternative hypotheses to explain the people and behaviour in the compound. On balance, however, it was concluded that none was as convincing as the original theory.

The nature of assumptions and how to tackle them

This simple story illustrates a powerful tool for avoiding becoming too wedded to your own brilliance. Our thinking invariably contains hidden assumptions, and when it comes to developing strategy and plans, those assumptions can become the root causes of execution failure. The person most likely to be able to ‘stress test’ the thinking is someone else! We see things as we are.

At the simplest level, ‘peer review’ offers an example of this type of approach, and it can work in almost any environment. The next level up is the formation of a Red Team, given the task of seeing a problem from a different frame of reference or from the perspective of a competitor. The ideal Red Team will contain three different types of individual:

  1. The subject matter expert, who knows the operational realities.
  2. The creative: a left-field thinker, maybe even just somebody from a completely different department.
  3. The analyst: a left-brain thinker who will challenge the underlying logic of any argument.

The next stage up

The next level of this approach is Wargaming. This does not require the acquisition of supercomputers or an enormous commitment of resources; it is simply about playing the plan out in very fast time around a table, with independent parties playing the roles of internal and external stakeholders.

The outputs are a greater understanding of how critical decisions may affect the outcome, and a basis for making those decisions on a judgement of acceptable risk. Additionally, personnel from across departments gain clarity on common purpose and priority efforts, and a better understanding of the wider context of their actions, thereby improving alignment and decision-making.

For important decisions, plans or campaigns, the effort of implementing a Red Team, or of scenario modelling through Wargaming, is likely to be a fraction of the cost of failure or of the reward for success. The requirement is not more resources but an open mind; the point is to work smarter, not harder.

For complex and changing situations, improved critical analysis and objectivity in planning can be major factors in delivering desired outcomes.

It is important to remember, however, that the Red Team is on our side. Its job is not to score points or make people look stupid; it is to make the plan better. Don’t shoot the messenger.

If it’s good enough for the CIA, it’s probably worth at least having a think about how objective you are in your organisation.