Nimrod XV230: Parallels with healthcare

In my second post related to The Nimrod Review, I consider how healthcare, an industry seemingly unrelated to the defence sector, has parallels with the Nimrod XV230 disaster. Hopefully, this discussion will help to illustrate that the organisational lessons from the Nimrod event are applicable to almost any industry. Please see my earlier article for a summary of the Nimrod incident.

Just like the defence sector, healthcare can certainly be described as a complex system – a combination of large organisations, employing a multitude of technical specialists, working as individuals and in (sometimes ‘loose’) teams, often using cutting-edge technology and engaged in dynamic situations. Healthcare teams are not just operating or maintaining complex equipment and processes; they are ‘maintaining’ other human beings, who may have complications unrelated to their treatment. Healthcare is an interrelated combination of research, equipment design, procurement, diagnosis and treatment, often by surgical intervention or the prescription of medicines. In fact, it’s hard to imagine a more complex system. This is a system to which we have all been exposed and will, at some point in our lives, rely on heavily.

The increased application of human factors in healthcare is often attributed to the 2000 report To Err is Human, by the US Institute of Medicine. In this key work, human errors in medicine are discussed and a roadmap toward a safer health system is proposed. It might first appear that, unlike a disaster that kills or injures dozens of people in a single event (such as in mining, offshore exploration, aviation or rail transport), failures in the healthcare system tend to affect individual patients. Harm to patients is usually documented on a case-by-case basis, and there is some debate as to the number of these cases.

However, the same organisational or systemic failure(s) may lead to an adverse event for hundreds of patients. Unlike major events like Piper Alpha or the Tenerife airport disaster, they don’t all occur at the same point in time or at the same geographical location. For example, the Kirkup Report of the Morecambe Bay investigation (March 2015, p.7) concluded that the “seriously dysfunctional nature of the maternity service at Furness General Hospital” led to the unnecessary deaths of mothers and babies over many years. I was pleased to read that the Kirkup Report discusses, in several places, ‘organisational failures’ – and it is these broader failures that are discussed in The Nimrod Review.

‘System’ problems have been revealed in other high-profile healthcare events, for example, the Healthcare Commission investigation into the apparent high mortality rates in patients admitted as emergencies to Mid Staffordshire NHS Foundation Trust concluded that “there were systemic problems across the trust’s system of emergency care” (2009, p.4). The subsequent Francis Report (2010, p.12) also refers to organisational failures when it concluded that “There was evidence of unacceptable standards of care as a result of systemic failings. What has been shown is more than can be explained by the personal failings of a few members of staff”.

  • Qu.1 – Do your investigation processes get to the organisational or systemic causes?

Organisational change

Behind many of the issues in the Nimrod event, there lies a systemic problem. Over a period of many years, the UK Ministry of Defence (MOD) underwent what is described as a ‘tsunami’ of changes. I can’t help but draw parallels with the healthcare sector around the world, which, like the MOD, appears to be in a process of constant structural change. Healthcare providers have merged, policies have been amended, regulatory bodies have merged or changed, and services have been relocated. In the same way that The Nimrod Review speaks of the MOD suffering from ‘deep organisational trauma’, I noted that the Kirkup Report discusses serial restructuring, a ‘level of organisational chaos’ and organisational turmoil in healthcare providers.

Poorly-managed organisational change can lead to a range of outcomes, such as insufficient staffing levels, increased workload, a reduction in supervision, loss of experience, unclear responsibilities, uncertainty, confusion and conflicting priorities. These effects, either individually or in combination, have the potential to impact on the quality of patient care. The impacts from organisational change may not always be immediately apparent.

In a complex industry such as healthcare, organisational changes can create new ways for the system to fail.

At Mid Staffs, a substantial reduction in nursing staff (when the hospital already appeared to be under-staffed) was implemented without an effective risk assessment of the impact of these organisational changes on the quality of care provided to patients. Many of the outcomes of poorly-managed change listed above occurred as a result.

The objective of organisational change is often the implementation of cost-saving measures, which can have a direct impact on the quality of patient care (such as a reduction in clinical staff). However, there will also be indirect impacts as resources required to implement these changes are directed away from clinical care and towards actually ‘delivering’ the changes and savings.

In the same way that the policies, decisions and actions of the MOD had an impact on individuals throughout the hierarchy, changes dictated by the Department of Health can have far-reaching impacts. Organisational changes at all levels of the healthcare system, from national government to individual departments, should be assessed for their potential impacts on patient care.

  • Qu.2 – Are organisational changes in your facility or department formally assessed for their potential impact on the management of patient safety?
  • Qu.3 – Do you assess the cumulative impact of a succession of changes?

Learning from warning signs and previous events

As I read the Kirkup Report, I saw parallels with the history of events before the Nimrod XV230 disaster: “The 2008 incidents were treated as individual unconnected events, and no link was made with previous incidents” (Kirkup Report, 2015, p.8) – a finding that could have been taken directly out of The Nimrod Review, which reported that there were a number of previous incidents and warning signs that should have been ‘wake-up calls’. In relation to the Morecambe Bay investigation, I’d go as far as to suggest that the Kirkup Report could have been subtitled ‘A story of missed opportunities’.

In the inquiry into obstetric and gynaecological services at King Edward Memorial Hospital, Australia (Douglas Inquiry, 2001), it was similarly concluded that the clinical problems were long-standing, recurrent and widely known.

Organisational structures in complex organisations such as the MOD or the health service can make sharing and learning harder. For example, the Public Inquiry into children’s heart surgery at the Bristol Royal Infirmary described a large number of clinical directorates as ‘silos’, which did not effectively communicate with each other. This structure made it more difficult for the organisation to learn from warning signs.

  • Qu.4 – Do you join the dots between previous incidents, even though they may appear to be unrelated at first?
  • Qu.5 – Do you treat near-misses or reported errors as warnings, and act on them appropriately?

Priorities

Clearly, in the MOD there was a shift in priorities away from safety and airworthiness towards business and financial targets. This finding of The Nimrod Review bears a striking similarity to the conclusion of the Healthcare Commission in its investigation of Mid Staffs, which stated that:

“many senior doctors whom we spoke to considered that the trust was driven by financial considerations and did not listen to their views” (2009, p.8).

In the MOD, a shift in culture was largely driven by countless reorganisations and the focus on cost-saving measures. In the UK National Health Service (NHS), there has been a drive for healthcare providers to become ‘foundation trusts’, which frees them from central Government control and enables them to retain any surpluses they generate, or even to borrow funds for investment. There was evidence in the Francis Report that management thinking was dominated by financial pressures and that the pressure to ‘break even’ was huge. There is a significant potential for the leadership of these organisations to become preoccupied and lose sight of their primary aim – to provide quality care to patients.

Organisational issues are, by their very nature, difficult to discuss individually, for they are often inter-related, such as the three failures of Leadership, Culture and Priorities that together sum up The Nimrod Review. In relation to these three failures, the Francis Report (2010, p.184) makes disturbing reading:

“I found evidence of the negative impact of fear, particularly of losing a job, from top to bottom of this organisation. Regrettably, some of the causes of that fear have arrived at the door of the Trust from elsewhere in the NHS organisation in the form of financial pressures and fiercely promoted targets”.

  • Qu.6 – How would you demonstrate that patient safety is a priority?
  • Qu.7 – Are your most prominent messages about budgets and financial targets?

Paper-based safety versus reality

I often refer to a quote from The Nimrod Review that “There has been a yawning gap between the appearance and reality of safety” (p.579). Various systems and mechanisms to detect failures were in place; however, they failed to identify that the huge amount of documentation claiming all was well was in fact just a paperwork exercise. Similar comments were made in the investigation into the fires and explosion at Buncefield, where the UK HSE concluded that the safety report and the safety management systems did not reflect what actually went on at the site.

Unfortunately, investigations into healthcare events often reveal that an elaborate system of overview and scrutiny existed, involving several organisations. However, these organisations (including healthcare regulators) were unable to identify significant failures in the healthcare providers over which they provided oversight. In fact, the Healthcare Commission investigation of Mid Staffs recommended that “Trusts to ensure that systems for governance that appear to be persuasive on paper actually work in practice” (2009, p.11).

  • Qu.8 – Do you know whether there is a difference between how activities are described in your procedures and how they are actually completed in practice? (Often described as a gap between ‘work as imagined’ and ‘work as done’).
  • Qu.9 – Do you know whether your processes for oversight are working as planned?

Lessons on Safety Cases

There have been discussions as to whether the concept of a ‘safety case’ would be helpful for the healthcare sector. There is no doubt that, if implemented correctly, a safety case regime can add significant value. However, my own experience as a health and safety regulator in the oil, gas, chemical and railway industries is that it isn’t always implemented well. It’s not unusual for these safety reports or cases to be produced by a contractor with little knowledge of the facility or organisation. Sometimes, safety cases were copied from one site to another, without acknowledging key differences. It often seemed as though the safety case was produced for the benefit of the regulator, rather than as a useful tool for the company.

Safety cases should be live documents and their production should be an opportunity to challenge the status quo, not simply document it. Here’s what The Hon. Mr Justice Haddon-Cave thought of the Nimrod Safety Case (although he suggests calling them ‘Risk Cases’ going forward):

“Unfortunately, the Nimrod Safety Case was a lamentable job from start to finish. It was riddled with errors. It missed the key dangers. It was essentially a ‘paperwork’ exercise. It was virtually worthless as a safety tool. The defining features of the four years it took to produce the Nimrod Safety Case are high levels of incompetence, complacency and cynicism by the organisations and key individuals involved” (Nimrod Review, p.259).

There is much that the healthcare sector can learn from other industries before embarking on a safety case approach.

  • Qu.10 – Do you have the capability to understand and challenge technical work undertaken by contractors or independent advisers?
  • Qu.11 – Do you outsource your thinking?

Leadership and culture

In the Nimrod disaster, ‘the fundamental failure was a failure of Leadership’. Inadequate leadership and poor organisational culture have been recurrent features in major incidents around the world, such as Buncefield, Texas City, the NASA Shuttle incidents, Macondo and Fukushima. These issues have also been highlighted in healthcare inquiries, such as that into children’s heart surgery at the Bristol Royal Infirmary and the Mid Staffs inquiry into mortality rates in patients admitted as emergencies. The Kirkup Report found that deep-rooted problems of organisational culture blighted the maternity unit for years.

The Nimrod Review found that leaders failed to ensure the integrity of the Safety Case process, failed to identify or act on warning signs, failed to maintain an appropriate safety culture and allowed financial aspects to take priority. Leadership begins at the top of the organisation. In the healthcare sector this includes the CEO and Executives, Board members, senior clinical leaders and department heads.

Leadership is not just management. Whereas management can be defined as ‘doing things right’, leadership is more about ‘doing the right things’.

Leaders must challenge the organisation and drive continuous improvement in the culture. Leaders set the tone of the organisation.

Good healthcare leaders understand that ‘medical errors’ and other adverse patient safety events are not simply due to careless staff, but result from organisational and system failures. For the quality of patient care to be improved, leaders wishing to learn from the Nimrod incident should ensure that patient safety is a strategic priority for all staff – and provide the resources to make this a reality. Leaders should foster a culture that supports the reporting and discussion of errors where adverse events do occur, so that lessons can be learned. I’m reminded of a recommendation from the Mid Staffs investigation – “Develop and promote an open, learning culture” (2009, p.12).

Good patient safety leadership does not happen by chance.

  • Qu.12 – Is patient safety on the agenda of your senior leadership or Board meetings?
  • Qu.13 – How does your Executive team create a ‘patient safety’ culture?
  • Qu.14 – Do senior leaders do ‘walk-arounds’ and speak to a range of staff about patient safety issues?

Conclusion

From the above (which I don’t claim to be a full analysis of the issues), it is clear that the defence industry and healthcare are more similar than might first be thought. The key lessons from Nimrod XV230 apply to healthcare for the simple reason that the loss of XV230 was not purely due to technical failures. There were systemic failures of leadership, culture and priorities. These organisational issues apply equally well to the healthcare sector. Can it happen here? Of course, the specifics of the Nimrod incident will not be repeated in the healthcare setting; but it’s likely that in the next major inquiry in healthcare, failures of leadership, culture and priorities will play a significant role. Does it have to be that way?

I’ll finish this post with the words of The Hon. Mr Justice Haddon-Cave:

“Many of these lessons and truths may be unwelcome, uncomfortable and painful; but they are all the more important, and valuable, for being so. It is better that the hard lessons are learned now, and not following some future catastrophic accident” (Nimrod Review, p.580).
