Nimrod, Sept. 2006

The Nimrod Review - A failure of leadership, culture and priorities


The Nimrod aircraft has served the UK Royal Air Force (RAF) since 1969. On September 2, 2006, RAF Nimrod XV230 suffered a catastrophic mid-air fire, shortly after air-to-air refuelling, on a routine mission over Afghanistan. This led to the loss of the aircraft and all 14 service personnel.

The immediate cause of the fire was aviation fuel coming into contact with a high-temperature ignition source. The fire occurred in a part of the aircraft to which the crew had no access and which had no fire protection system.

An independent inquiry, led by Charles Haddon-Cave QC, concluded that the accident was avoidable and due to significant failures by the Ministry of Defence (MOD), BAE Systems (who produced the Safety Case) and QinetiQ (independent adviser). The report names and criticises ten individuals from these organisations for their role.

Warning signs for the future

As with several other major incidents (such as the NASA Space Shuttle disasters), there had been a number of previous incidents and warning signs within the MOD, which the investigation describes as missed opportunities that should have been wake-up calls. When incidents did occur, they were treated as isolated events, or ‘one-offs’, with little consideration of whether there might be patterns, trends or systemic issues. The wider implications of events were not considered (for example, the relevance of fuel leaks on Harriers and Tornados to other aircraft with similar designs, such as the Nimrod); the Review calls this ‘closed-loop’ thinking. Notably, following a serious incident involving Nimrod XV227 in November 2004, the implications for the wider Nimrod fleet (and the Nimrod Safety Case) were not considered.

Echoes of Columbia

I had the privilege of meeting with Sir Charles during the Nimrod Review, when we discussed the parallels between the loss of Nimrod XV230 and the loss of NASA Shuttle Columbia in 2003. These parallels are ‘organisational’ causes, which have also been observed in other major accidents (including the previous NASA Challenger disaster).

The deep-seated organisational causes that are common to both organisations are as follows:

  1. The ‘can do’ attitude and ‘perfect place’ culture.
  2. Torrent of changes and organisational turmoil.
  3. Imposition of ‘business’ principles.
  4. Cuts in resources and manpower.
  5. Dangers of outsourcing to contractors.
  6. Dilution of risk management processes.
  7. Dysfunctional databases.
  8. ‘PowerPoint engineering’.
  9. Uncertainties as to Out-of-Service date.
  10. ‘Normalisation of deviance’.
  11. ‘Success-engendered optimism’.
  12. ‘The few, the tired’.

The Nimrod Review, quite rightly, devotes most of a chapter to a discussion of the Columbia disaster and the similarities between it and Nimrod XV230, and between NASA and the MOD.

The safety case regime

Perhaps the most significant lessons for other high-hazard industries concern the Nimrod Safety Case. The purpose of a safety case (or a safety report in some industries) is to identify, assess and mitigate potentially catastrophic hazards before they cause an accident. It is the means by which organisations demonstrate, firstly to themselves (and secondly to regulators), the safety of their activities. The first ‘safety cases’ in the UK were mandated under the Control of Industrial Major Accident Hazards Regulations 1984, and the approach has since been adopted more widely, for example in the nuclear, rail and aviation sectors.
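The identify–assess–mitigate cycle at the heart of a safety case can be illustrated with a minimal sketch of a hazard-log entry. This is purely illustrative; the field names and the simple severity × likelihood risk matrix are my own assumptions, not the structure of any real hazard log or of the Nimrod Safety Case itself.

```python
from dataclasses import dataclass, field

@dataclass
class Hazard:
    """One entry in a hypothetical hazard log, following the
    identify -> assess -> mitigate cycle described above."""
    description: str   # identify: what could go wrong
    severity: int      # assess: 1 (minor) .. 4 (catastrophic)
    likelihood: int    # assess: 1 (remote) .. 4 (frequent)
    mitigations: list = field(default_factory=list)  # mitigate: controls applied

    def risk_score(self) -> int:
        # A simple risk-matrix product; real schemes vary by industry.
        return self.severity * self.likelihood

h = Hazard("Fuel contacts hot cross-feed duct", severity=4, likelihood=2)
h.mitigations.append("Insulate duct / isolate fuel lines")
print(h.risk_score())  # 8
```

The point of the structure is that every hazard carries its own assessment and controls, so gaps (a hazard with no mitigations, or a high residual score) are visible at a glance rather than buried in prose.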

In the case of the Nimrod, a suitable safety assessment would have identified risks introduced by various modifications, including the air-to-air refuelling modification.

Had the safety case captured and controlled the aircraft’s inherent design flaws, the accident would have been avoided – and so here lie significant learning opportunities for all organisations with a similar safety case regime. The Nimrod Safety Case is described as a paperwork exercise, virtually worthless as a safety tool. It provided a false sense of security, and it diverted resources from actually managing safety and airworthiness. The consultancy fees for Nimrod Safety Case 2 were in excess of £3 million.

In the Ladbroke Grove Rail Inquiry (Part 2, 2001), Lord Cullen stated that “Safety Cases were intended to be an aid to thinking about risk, not an end in themselves”. The Nimrod Safety Case was undermined by a history of previous success and a widely-held assumption that the aircraft was ‘safe’. This weakened the integrity of the Safety Case, which became a documentary rather than analytical exercise and therefore failed to identify the potential for catastrophic fire before it was too late. As a result, serious design flaws were able to lie dormant for decades.

The Nimrod Review pulls no punches on this matter:

“Unfortunately, the Nimrod Safety Case was a lamentable job from start to finish. It was riddled with errors. It missed the key dangers. Its production is a story of incompetence, complacency and cynicism. The best opportunity to prevent the accident to XV230 was, tragically, lost” (Nimrod Review, 2009, p.161).

The MOD contracted out production of the Nimrod Safety Case to BAE Systems and simply accepted that the task had been completed, with little review or challenge. The MOD did not fully test or question the contents of the Case, which was in fact incomplete and contained systemic errors. The Review states that it is unfortunate that the MOD and QinetiQ ‘did not ask some intelligent questions’, and that the monitoring of the outsourcing to BAE Systems was the ‘antonym of intelligent’. This is a good example of a failure to act as an ‘intelligent customer’. As well as outsourcing the Safety Case, the MOD also ‘outsourced its thinking’.

What is ‘safety’ anyway?

One of my favourite concepts in the Nimrod Review is the description of the Nimrod Safety Case as an ‘archaeological’ exercise – locating design data and other historical documentation rather than conducting fresh analysis and challenge. This became a self-fulfilling prophecy. The Review discusses how the task was to demonstrate through such documentation that the aircraft was safe, rather than to look for gaps in safety. It is difficult to claim that complex systems are completely ‘safe’; it is much better to argue in a Case that hazards have been identified and risks controlled. The Review notes that on the day before the Challenger (1986) and Piper Alpha (1988) disasters, those systems could have been considered ‘safe’ based on an analysis of past incidents alone…

Documenting the past in this way is described as creating a ‘pointless’ Safety Case. It is easy to give the strong impression of safety, but not deliver it. This is one of the reasons for Sir Charles reaching the conclusion that:

“There has been a yawning gap between the appearance and reality of safety. The system has not been fit for purpose” (Nimrod Review, 2009, p.579).

So, why?

Failures such as the inadequate Nimrod Safety Case have been discussed above. The reasons for these failures include:

  • high workloads of the Nimrod team;
  • operational pressures;
  • reduced focus on safety;
  • budget pressures;
  • dilution of oversight;
  • fewer checks and balances.

‘Five whys’?

A deceptively simple incident investigation technique is to ask ‘Why?’ five times, each time delving deeper into the underlying causes. In this case, we could ask why the conditions above – workload, pressures and a reduced focus on safety – came to exist. The underlying causes of many of these factors are a topic discussed in some detail elsewhere on this website: organisational change. Below I summarise the key points from the Review.
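The ‘five whys’ drill-down can be sketched as a simple chain, each answer becoming the subject of the next ‘Why?’. The answers below are my own illustrative paraphrase of factors discussed in this article, not wording from the Review itself.

```python
# A minimal sketch of the 'five whys' technique. The answers supplied
# are hypothetical, loosely based on the factors discussed above.
def five_whys(event, answers):
    """Walk a chain of 'Why?' questions, pairing each with its answer."""
    chain = []
    cause = event
    for answer in answers[:5]:  # ask 'Why?' at most five times
        chain.append((f"Why {cause}?", answer))
        cause = answer          # the answer becomes the next question
    return chain

chain = five_whys(
    "was the Safety Case inadequate",
    [
        "the team was overloaded and under operational pressure",
        "resources had been cut and oversight diluted",
        "cost-saving initiatives were imposed without safety assessment",
        "organisational change was continuous and unmanaged",
        "leadership did not treat airworthiness as the top priority",
    ],
)
for why, answer in chain:
    print(f"{why} -> {answer}")
```

The value of the exercise is that it forces the investigation past the first convenient answer; note how quickly the chain moves from the team to the organisation.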

Organisational change

“The MOD suffered a sustained period of deep organisational trauma between 1998 and 2006 due to the imposition of unending cuts and change, which led to a dilution of its safety and airworthiness regime and culture and distraction from airworthiness as the top priority” (Nimrod Review, 2009, p.355).

The Nimrod Review outlines in detail the many organisational changes that had an impact on safety and airworthiness; so many, in fact, that they are described as a ‘tsunami’ of cuts and change (over 900 cost-reducing initiatives in the Defence Logistics Organisation). The ‘organisational trauma’ referred to in the quote above is reported to have stemmed from the Strategic Defence Review of 1998, which became a continuous process rather than a discrete change.

This created a series of initiatives and intensified three organisational themes:

  1. A shift from a functional structure (e.g. engineering, logistics, finance) to project-oriented structures;
  2. Creating larger structures, such as tri-service ‘purple’ organisations (purple is the colour obtained from blending the uniform colours of the air, sea and land services);
  3. Outsourcing to industry many functions traditionally undertaken by uniformed personnel.

This created both a ‘business culture’ and a ‘change culture’ which diluted the previous safety and airworthiness culture, and considerable resources were directed to delivering these changes and savings. During the same period (~1998 to 2006) the operational demands increased significantly, yet the scale or tempo of required cuts and savings wasn’t adjusted to reflect these increasing demands.

In addition, the replacement date of the 30-year-old Nimrod aircraft had been extended. The aircraft would likely need more attention and vigilance as its service life increased.

Clearly, more assurance, checks and balances, and a strengthened safety culture would be required during this process. The Nimrod Review concluded that significant changes to organisational structures or resources (such as the strategic goal of 20% cost-savings over 5 years) were not fully assessed for the potential safety and airworthiness implications. Two key issues are highlighted in the Review: the direct impact of cost-saving measures on safety and the indirect impact of time, attention and resources being redirected to delivering change and savings.

The product of such continuous organisational change was more and more complexity. However, a safe system is generally the opposite; safety usually arises from simplicity. Sir Charles states that simplicity is your friend and complexity your enemy (APPEA Conference, Perth, 8 April 2014). I was honoured to be quoted in the Nimrod Review for my comment on the complexity of the NASA organisation in relation to the Shuttle incidents:

“NASA was so complex it could not describe itself to others (Martin Anderson, HSE, 2008)” (Nimrod Review, 2009, p. 492).

Organisational change is seen by the Nimrod Review as a causal factor in the loss of Nimrod XV230. The following quote from the Review is worthy of repeating here in full, for it sums up the devastating impact that the organisational changes had on the MOD; and how these changes can be related to the loss of Nimrod XV230 in 2006:

“Change can be seriously inimical to safety and airworthiness unless properly planned, resourced and managed. It can lead to the organisational dilution of safety structures. It can lead to a diversion of resources from safety matters. It can distract attention from safety issues. It can lead to a shift in priorities. It can change the culture. In this case, it did” (Nimrod Review, 2009, p.368).

‘Airworthiness safety’ versus ‘personal safety’

Following several major accidents in the oil, gas and chemicals industries (most notably Esso Longford, 1998, and BP Grangemouth, 2000), an understanding emerged that ‘major-hazard’ or ‘process safety’ requires a specific focus, over and above traditional occupational health and safety. In my reading of the Nimrod Review, I saw a parallel in the MOD: a failure to recognise that ‘aviation’ safety is different from general health and safety. In the same way that process safety has received more attention and recognition in recent years, airworthiness is now being seen as a specialised technical discipline.

Organisational failures

In a striking similarity to incidents such as BP Texas City, Shuttle Columbia and Macondo, the Nimrod XV230 accident was found to be due not to failures of those at the sharp end, but to failures of those remote from the incident in time and space. The quote below reminds me of Prof. James Reason’s early discussions of latent failures.

“In this Report, I explain the manner in which the conditions for a major catastrophic accident can be created by lurking weaknesses, errors and omissions sometimes set in train years apart” (Nimrod Review, 2009, p. 23).

Some of the weaknesses discussed in the Nimrod Review include:

  • inadequate Safety Case regime (discussed above);
  • inadequate appreciation of the needs of aged aircraft;
  • weaknesses in the area of personnel and capability;
  • unsatisfactory relationship between the MOD and industry;
  • unacceptable procurement process;
  • a safety culture that does not work;
  • seeing airworthiness as part of safety generally;
  • too much change, not enough stability;
  • confusing lines of authority;
  • unclear roles and responsibilities (especially in relation to airworthiness);
  • failure to manage the needs of aging aircraft;
  • unclear and inaccurate understanding of hazards and risk;
  • poor understanding of the total risk picture;
  • inadequate investigation and trending;
  • voluminous and complex rules and regulations;
  • lack of corporate memory;
  • audits that focus on paper systems rather than real-world;
  • a system that has evolved rather than been designed;
  • insufficient airworthiness Leadership.

The way forward

The Review concludes that an effective safety and airworthiness system should be based on four key principles:

1. Leadership (from the top, for safety and airworthiness as overriding priorities):

“The fundamental failure was a failure of Leadership. As preceding Chapters have shown, lack of Leadership manifested itself in relation to the way in which the Nimrod Safety Case was handled, in the way in which warning signs and trends were not spotted, and in relation to inexorable weakening of the Airworthiness system and pervading Safety Culture generally. For these reasons, Leadership is a key principle for the future” (Nimrod Review, 2009, p. 491).

2. Independence (in relation to oversight, which should be independent from the pressures and conflicts of operations);

3. People (because safety is delivered by people, not just Process and Paper). This echoes current safety thinking that people are not hazards – people create safety;

4. Simplicity (so that everyone can understand regulatory structures, processes and rules).

Key recommendations

The Review provides many recommendations. I have summarised some of them here, many of which have wider application outside of the UK MOD:

  • for a new military airworthiness regime, requiring several structural, cultural and behavioural changes (to be implemented in a sensible and measured manner);
  • for an overhaul of regulations and management systems across the three Services;
  • enabling aircraft that are both safe to use, and used safely;
  • for a new accident investigation process, employing independent and experienced investigators; considering human and organisational as well as technical factors, and which can identify trends and learn lessons;
  • for safety cases (the Review proposed ‘Risk Cases’ instead) to be SHAPED: Succinct, Home-grown, Accessible, Proportionate, Easy to understand and Document-lite. Also, to clarify their purpose as “a reasonable confirmation that risks are managed to ALARP”;
  • greater consideration of the increasingly aged aircraft;
  • a whole range of Personnel and capability issues, such as valuing engineers and engineering skills (including ‘safety engineering’), intelligent customer capability, addressing shortage of manpower and reducing churn;
  • to examine relationships between the MOD and industry;
  • a new procurement strategy, for example, to counter cost over-runs and delays in new equipment;
  • building an engaged safety culture, bearing in mind the key role of leadership, based around the five elements: a Reporting Culture, Just Culture, Flexible Culture, Learning Culture and Questioning Culture;
  • briefing leaders in the lessons from major accidents such as Challenger, Columbia and Nimrod XV230.

Broader implications for safety Regulators

In 2010, I provided the human and organisational factors input into a working group tasked with understanding the broad implications of the Nimrod Review for the UK Health and Safety Executive (HSE), a Regulator of onshore and offshore major hazard facilities (in the oil, gas, chemicals and nuclear industries).

Martin Anderson (right) in the HSE Nimrod working group

This incident was a reminder to industries with a safety case regime (and their Regulators) that paper documents don’t deliver safety – processes do. Safety cases shouldn’t be written by contractors and then shelved; they should be written to help companies understand the issues within their organisation. Safety cases should be accessible and easy to understand. And most of all, they should be used. HSE incorporated lessons from the Nimrod Review into guidance on safety management systems, and into its approach to regulating safety leadership. The question that we asked of the organisations we regulated was ‘can it happen here?’.

Will we ever learn?

“Many of these lessons and truths may be unwelcome, uncomfortable and painful; but they are all the more important, and valuable, for being so. It is better that the hard lessons are learned now, and not following some future catastrophic accident” (Nimrod Review, 2009, p. 580).

Sir Charles believes that the most fitting memorial to the crew of Nimrod XV230 will be that these lessons are truly learned and recommendations implemented. Personally, I would hope that such learning takes place in all high-hazard and complex organisations, not just the defence sector. The Nimrod Review should send ripples across all high-hazard and complex organisations.

More information on Nimrod XV230

For more information and discussion on the Nimrod XV230 incident, see several of my articles here: humanfactors101.com/tag/nimrod-xv230/

Nimrod XV230 – Reflections on leadership, culture and priorities, my latest presentation on this event, delivered at the APPEA conference and exhibition, 15 May 2017, Perth, Australia.

The Nimrod Review: An independent review into the broader issues surrounding the loss of the RAF Nimrod MR2 aircraft XV230 in Afghanistan in 2006, Charles Haddon-Cave QC, first published 28 October 2009, ISBN 9780102962659. I can’t begin to summarise the significant content of this nearly 600-page report in a couple of thousand words here, so I’d urge you to refer directly to The Nimrod Review – or at least read the brief chapter summaries.

Lessons from the Nimrod Review, a keynote presentation delivered by The Honourable Mr Justice Haddon-Cave at the 26th Hazards Symposium in Edinburgh, Scotland, on 24 May 2016.

‘Leadership and Culture, Principles and Professionalism, Simplicity and Safety – Lessons from the Nimrod Review’, The Honourable Mr Justice Haddon-Cave. The video below was filmed at the Piper 25 event in Aberdeen, UK, June 2013 (on the 25th anniversary of the Piper Alpha disaster). A transcript of this speech at Piper 25 can be found here.

 

“Leadership & Culture, Principles & Professionalism, Simplicity & Safety: Lessons From The Nimrod Review”, transcript of a speech by The Honourable Mr Justice Haddon-Cave, APPEA 2014 Conference, Perth, 8 April 2014.