In this sixth post addressing issues arising from The Nimrod Review, I consider the relationship between complexity and system failures. RAF Nimrod XV230 suffered a mid-air fire shortly after air-to-air refueling whilst on a routine mission over Afghanistan almost 10 years ago on 2 September 2006. This fire led to the loss of the aircraft and the death of all 14 personnel on board.
The Nimrod Review concluded that an effective safety and airworthiness system (airworthiness being the fitness of an aircraft to fly) should be based on four key principles. One of those principles is ‘simplicity’.
In my previous post, I discussed the ‘tsunami’ of organisational changes in the Ministry of Defence (MOD) in the years prior to the Nimrod XV230 disaster. The product of such continuous organisational change was more and more complexity. However, a safe system is generally the opposite; safety usually arises from simplicity. The author of The Nimrod Review, The Hon. Mr Justice Haddon-Cave, states that simplicity is your friend and complexity is your enemy. I was honoured to be quoted in The Nimrod Review for my comment on the complexity of the NASA organisation in relation to the Shuttle incidents:
“NASA was so complex it could not describe itself to others (Martin Anderson, HSE, 2008)” (Nimrod Review, p. 492).
When we build more complexity into systems, we introduce more potential interactions between components. As many ‘systems thinking’ authors point out, accidents in complex systems often result not from single component failures, but from the dysfunctional interaction of reliable and functioning components. So, accidents can ‘emerge’ from system complexity. The more complex a system, the harder it is to predict how it will actually work in practice (and how it will fail).
It’s possible to break down a complex mechanical system such as an aircraft engine (or even an offshore oil and gas platform) into its individual components, and understand how each part interacts and relates to the others. However, it is much harder, if not impossible, to do this with a complex organisation – often referred to as a sociotechnical system.
One of the world’s greatest authorities on chemical process safety, Dr Trevor Kletz (1922–2013), published an article in 1978 entitled “What You Don’t Have Can’t Leak”. Kletz is often described as the father of inherently safer technology and processes. His writings on human error and accident investigation refocused industry’s emphasis away from individual failures and towards inherently safer design. Simplicity and inherent safety are complementary: eliminating unnecessary complexity can increase safety. As an everyday example, single-storey homes are inherently safer than multi-storey houses because they have no stairs, which are a major cause of serious accidents in the home.
The Ministry of Defence (MOD), through many organisational changes over a prolonged period of time, became more and more complex. In combination with other factors, such as a shift in priorities (from delivering safety, to meeting financial targets), this complexity helped to pave the path to disaster:
“During the period 1998 to date, the MOD airworthiness regime suffered from an inexorable descent into the vortex of ever-increasing complexity and confusion” (Nimrod Review, p. 390).
My regulatory experience (UK HSE) is that even if individual organisational changes are assessed, the cumulative impact of a series of such changes usually isn’t considered. So, the new organisational model becomes the baseline against which any further changes are measured, rather than the original organisational design. This is a type of organisational drift (see my post on Normalisation of deviance). It is often unintentional, but some organisations purposely divide a significant organisational change into a series of smaller changes so as not to alarm the regulators.
The Nimrod Review describes the MOD airworthiness system as being of ‘Byzantine complexity’ and the MOD is said to have found false comfort in this complexity:
- diffuse, diluted and opaque roles and responsibilities
- conflicting and unclear lines of authority, not widely understood by others in the organisation
- fragmented airworthiness duties and responsibilities
- many disparate regulators, each responsible for different aspects of airworthiness and each with a different level of authority
- a huge volume of virtually impenetrable defence regulations
- equipment and operational risks managed separately
- a myriad of different risk matrices for determining risk categories
- multiple layers of safety and airworthiness boards, meeting groups, committees, sub-committees and working groups.
Over many years, the responsibility for risk management was divided, dissipated and dispersed to the point that everyone was involved but no-one was responsible.
By understanding what complexity meant in practice for the MOD, you may be better placed to recognise unnecessary complexity in your own organisation, site or department. Can you quickly and easily describe the different functions in your organisation, their purpose, and how they interact? Are you able to describe how critical processes work (who does what, and when)? Can all of your colleagues do the same? You may not necessarily have to change the organisational structure or processes; it may simply be a case of mapping key processes and communicating them to those who need to know.
If you’re planning or implementing a change to the organisation, will this increase complexity? Will roles and responsibilities for key functions and processes become blurred? Are you introducing unclear lines of communication? Please see my topic page for more tips on how to manage the impacts of organisational change.
Sometimes, less is more.
In a recent presentation, The Hon. Mr Justice Haddon-Cave quoted the British economist E.F. Schumacher: “Any intelligent fool can make things bigger, more complex, and more violent. It takes a touch of genius – and a lot of courage – to move in the opposite direction”.