Jamie Lawrence, Insights Director, Wagestream

An iceberg sank the Titanic, right? Wrong.


David Buchanan BA (Hons) PhD FRSE Chartered FCIPD is Professor of Organisational Behaviour at Cranfield School of Management.

If you investigate why the Titanic sank, you’ll realise the iceberg was just the final straw.

“There were a whole host of contributory factors,” says David Buchanan, a professor at Cranfield School of Management. “The lookouts in the crow’s nest didn’t have binoculars – they’d been mislaid on the ship’s sea trial and had never been replaced. The sky was clear that night, and the sea was calm – perfect sailing conditions, but icebergs are difficult to see at night when there are no waves breaking against them. This is just the beginning – there were many other factors in play.”

The same analysis applies to other catastrophes, such as the Deepwater Horizon oil spill and the loss of the NASA shuttles Challenger and Columbia.

“There’s always a short answer as to what caused an event, and a longer answer. In the Challenger disaster, for example, you could say it was caused by O-ring failure. But the investigation unveiled a whole series of contributory causes including managerial and organisational problems, communication breakdown, and poor management of subcontractors. What this means is that in most cases the fundamental cause of the extreme event comes down to failures in the system itself.”

What we’re talking about here is complex causality. It’s a key area for Professor Buchanan, who researches why organisations struggle to learn from, and improve themselves in the light of, ‘extreme events.’

One of the reasons is that complex causality allows blame to be thrown around like a hot potato. Which is, unfortunately, something that human beings are pretty good at.

“Have you ever put petrol into a diesel tank? If you have you’ll remember how angry you were at yourself. But rationally you can understand why it happens. You’re tired, caught in traffic, running late, just dropped your keys, can’t find them in the dark and then – in anger – you unscrew the petrol cap, press the handle and it’s too late. And even after all the contributing factors, you blame yourself.

“We as human beings have a strong desire to want to attribute blame. This is known as the fundamental attribution error and it means holding individuals to account while ignoring the context in which they were operating. It’s an extremely strong desire and very difficult to overcome.”

In other words, while extreme events are, in the majority of cases, systems errors, the fundamental attribution error kicks in and we all want to see someone on the gallows.

As well as a preoccupation with blame, our approach to investigating extreme events does not allow us to paint a full picture of why the system failed. We’ll take an example based on Professor Buchanan’s research – a patient who was given the wrong implant in an operating theatre. After such a serious incident, what happens to those who were present and involved with the case?

“You’ll be told to write a witness statement that answers a number of questions: what you saw, what you did, what you said, what you didn’t do. You’re told it’s confidential, but you are also told that your statement may be used in a coroner’s inquest, in response to a clinical negligence claim, and in other judicial hearings.

“So are you going to be fully open and honest? Of course not. You’ve been pulled into a quasi-legal process where what you say could unwittingly implicate you or some of your colleagues. We involve people in this legalistic process, and we exclude them from investigating what the causes really were.

“And these people, if you speak to them you’ll find most are shocked and traumatised. It’s not something that’s taken lightly. They are shocked it has happened in their theatre and on their watch. I’ve come across cases where those most closely involved have not been able to return to work.”

Which makes it all the more distressing that Norman Lamb, the care services minister, wants to bring in legislation to prosecute staff if they deliver poor care. I’m not joking.

Action to prevent further incidents usually involves a defensive change agenda, aimed at stopping the same things from happening again. The quasi-legal investigative process, however, makes it difficult for any progressive, developmental changes to be implemented.

“The people directly or indirectly involved with an incident know exactly why it happened, but we make it difficult for them to speak out. And we exclude them from shaping the post-event change agenda. I mean it’s Change Management 101, Rule 1: if you want people to commit to change, get them involved in the change from the get-go. But of course what happens is that following these events we exclude those with the best knowledge, we exclude those who will have to work with the changes, we delay things for six months while the investigation unfolds, and then someone else tells them what’s going to happen. Would you be committed at that stage?”

Luckily, progress is being made when it comes to supporting those involved, although it doesn’t address the problems of forming a coherent narrative of the extreme event and ensuring progressive recommendations can be implemented.

“A new initiative called Schwartz Center Rounds brings together staff who have been involved in extreme events – and it’s multilevel, so will include doctors, nurses and porters – so they can talk about what happened and how they feel. This is a safe, confidential setting, the conversation is not recorded, and the process is designed to be cathartic.”

This type of approach, relatively unheard of in the private sector, has been contributing to aviation safety for many years. Before confidential reporting (without fear of disciplinary action) was introduced, pilots and aircrew who experienced safety events such as near misses would not tell anyone in case they got fired or disciplined. The new approach to open reporting has undoubtedly contributed to the safety record of the aviation industry.

“If you have a near miss you tell someone. It may not be your fault, it may be that the signals aren’t accurate, it could be anything, but you tell someone. Basically, the airline industry has accepted the fundamental fact that system errors are often to blame for extreme events. Now if you accept that the causes of extreme events are for the most part systemic, then you need a systemic solution. Which is why firing individuals or prosecuting them is really not a sensible approach.

“Imagine if they tried to criminalise pilots who caused harm to passengers. It’s unthinkable. Once you criminalise these things people shut up. They shut down. Bad things happen again.”

And recur they do, reminding me of a quote famously attributed to Einstein – “Insanity is doing the same thing over and over again and expecting different results.” I’ll let David explain this one.

“My colleagues studied a year’s worth of serious incidents in a hospital that we were working with. In every case the same seven or eight factors cropped up in explaining why these serious incidents occurred – for example, poor information, inadequate processes, ineffective monitoring, lack of expertise, no reporting to senior management. And case in point: we had a serious incident and the manager said, ‘well, we should do an investigation,’ and my colleague turned to him and said, ‘why bother? Just read the report of the last incident to see what went wrong this time.’

“And I haven’t read it, but if you read the report into Daniel Pelka’s death it probably contains the same storyline as what happened to Victoria Climbié 15 years ago. Next time something awful happens to a child, why bother with an investigation? Just read the report from last time and change the names.” (Professor Buchanan added that, of course, there are differences, but the systemic reasons remain the same. I haven’t read the report either, but I’m inclined to agree with him.)

Ultimately, we need to wake up and see extreme events for what they are – systems errors that give us a valuable opportunity to redesign the system itself. Until we do, heads will roll, but nothing will change.

And the people involved – traumatised, shocked, and the only people who can really explain what happened – will get side-lined, scapegoated and fed to the wolves.
