It’s not what you think . . .
In several recent incident reports, I have seen the phrase 'situation awareness' listed as a cause; or more specifically, a 'loss of situation awareness' by a pilot, air traffic controller, physician or process operator. This term has become overused and misused; it is often shorthand for saying that someone didn't pay enough attention, or was careless. In some cases this has unfortunately led to inappropriate blame and punishment, at the expense of real learning. 'Loss of situation awareness' has almost become another way of saying 'human error'. As the late, great Trevor Kletz said, concluding that incidents are due to human error is about as useful as stating that falls are due to gravity. The same could be said of situation awareness.
The term originated in the aviation sector, but has since been used in the defence, healthcare, space, mining, and oil and gas industries. It is a key human factors topic.
In this article I’ll define Situation Awareness and clarify what it is, what it isn’t, why it is a useful concept and how it can be misused.
I will also outline what often lies behind situation awareness, highlighting some of the underlying factors that may have contributed to a loss of situation awareness in significant incidents.
Situation awareness and Non-Technical Skills
I’ve written elsewhere about Non-technical Skills, the interpersonal, behavioural and cognitive skills that complement our technical skills. In safety critical industries, technical skills are obviously necessary, but not sufficient on their own. Non-technical skills are often not taught, at least not to the same extent as technical knowledge and technical skills. In some industries, Non-Technical Skills are known as Crew Resource Management (CRM).
Situation awareness is considered to be one of the most essential non-technical skills. Most training courses in non-technical skills include a module on situation awareness.
Defining Situation Awareness
Paying attention to our surroundings is part of situation awareness, but it is not the complete picture. There are as many definitions as there are commentators in this area, but the one that I propose to use here is:
Developing and maintaining a dynamic awareness of the situation and the risks present in an activity, based on gathering information from multiple sources from the task environment, understanding what the information means and using it to think ahead about what may happen next.
This definition refers to reflecting on the past, present and future. It includes the key aspects of information processing – from perception, through interpretation to prediction. The reason for using the above definition is that it contains the five aspects that I think are key to this concept. Let’s break it down:
- Dynamic: Situation awareness is not a one-off or a snapshot, it is a continuous process, hence the 'dynamic awareness' phrase in the above definition. If what we expect to happen does not align with what is actually happening in the real world, we may check our data, gather new information, and revisit our decisions. This will help us to 'regain control'. Situation awareness is maintained by continually checking facts against our understanding. Often, we use our expectations to influence how attention is directed, how information is perceived and how it is interpreted.
- Perception or gathering information by using our senses of vision, hearing and touch. In many complex systems, people perceive the state of the system indirectly, through displays and interfaces rather than through direct observation.
- Understanding information by combining this data from the real world with existing knowledge and experience from memory (and through this process, creating mental models and story-building). Information gathered is given meaning. This includes developing an accurate and complete picture of the world, informing our decisions. Comprehension is formed by putting two and two together to get four.
- Prediction and projection into the future, which includes thinking ahead, asking ‘what if?’, updating mental models and anticipating the future state of our environment. This involves predicting what to expect (as well as what not to expect).
- Risks: The above definition also includes awareness of the current risk, and an evaluation of the future risk.
Situation awareness is therefore about proactively staying ahead of the situation. It is strongly related to decision making – it determines whether a good decision can be made. It is critical to understand the bigger picture before making decisions. Degraded situation awareness can lead to inadequate decision making and inappropriate actions. When situation awareness is lost, even momentarily, people may be slower to detect problems and require extra time to reorient themselves. This is a greater concern where timely decisions are required. In fact, part of situation awareness is an understanding of how much time is available before an event occurs or an action must be taken.
Mental models: these are the road maps that we use to interpret and make sense of the world.
These models drive the search for data and also help us integrate that data into meaningful assessments.
These processes of perception, understanding and prediction do not occur in a vacuum, they are heavily influenced by our preconceptions and expectations; and by the demands placed on our “information processing system” (for example, see the discussion of Working Memory).
Note that it is not necessary to progress through the three stages (perception, understanding and prediction) in a strictly linear or sequential fashion; people move between them as the situation demands. For example, a person who understands the current situation has better situation awareness than someone who reads every item of data on a screen but does not grasp what it means.
As mentioned above, gaining and maintaining situation awareness is an iterative process – people may actively look for data that confirms or denies their assessment of a situation; or seek data that fills gaps in their knowledge. Situation awareness is a dynamic combination of incoming data, processing of that data and seeking new data. People are not simply passive receivers of information, but take an active role in determining what is relevant. They may, for example, choose which information is displayed on control panels and in what format. Our awareness is almost constantly updated – as the situation changes, so must our situation awareness.
When we are unable to find relevant data (or we just don’t have the time), we may use assumptions, previous experience or ‘default’ data. Sometimes, a bias may be introduced at this stage; which can lead to errors of judgment.
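As a loose illustration only (this is not a model from the article, and all names, classes and thresholds below are hypothetical), the perception–understanding–prediction cycle described above can be sketched as a monitoring loop: missing data is filled with assumptions or defaults, and the mental model is revised when observations diverge too far from expectations.

```python
from dataclasses import dataclass

# Hypothetical sketch of the perception -> understanding -> prediction
# cycle. Names and tolerances are invented for illustration only.

@dataclass
class MentalModel:
    expected_level: float   # what the operator expects (e.g. a tank level)
    fill_rate: float        # assumed rate of change per minute

    def predict(self, minutes_ahead: float) -> float:
        """Prediction: project the future state from the current model."""
        return self.expected_level + self.fill_rate * minutes_ahead

def perceive(sensor_reading, default: float) -> float:
    """Perception: gather data; fall back to assumptions if data is absent."""
    return sensor_reading if sensor_reading is not None else default

def update_awareness(model: MentalModel, reading) -> MentalModel:
    """Understanding: compare observation with expectation and revise the
    mental model when the mismatch is too large to ignore."""
    observed = perceive(reading, default=model.expected_level)
    mismatch = observed - model.expected_level
    if abs(mismatch) > 5.0:   # arbitrary tolerance, for illustration
        # The expectation was wrong: revise the model, not the world.
        model = MentalModel(expected_level=observed,
                            fill_rate=model.fill_rate)
    return model

# One pass of the loop: a reading far from expectation forces an update.
model = MentalModel(expected_level=50.0, fill_rate=2.0)
model = update_awareness(model, reading=80.0)
print(model.expected_level)   # 80.0 - the model now reflects reality
print(model.predict(10))      # 100.0 - projection from the revised model
```

The point of the sketch is the shape of the loop, not the numbers: awareness is a product of continually reconciling incoming data with an existing model, and defaults stand in when data is missing.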
Is Situation Awareness a process, or the end state?
There is some confusion about whether the term situation awareness refers to either (1) the processes that people use to gather information and understand their world; or (2) the end-state that is derived from those processes. A key author in this area provides the following distinction:
"It is necessary to distinguish the term situation awareness, as a state of knowledge, from the processes used to achieve that state. These processes, which may vary widely among individuals and contexts, will be referred to as situation assessment or as the process of achieving, acquiring, or maintaining situation awareness." (Endsley, 1995, p. 36)
Situation awareness is therefore a product of the processes used to achieve and maintain it; and this product also affects those processes.
Sensemaking is the process of understanding the information and situations in which people find themselves. There are two key differences between sensemaking and situation awareness.
- Sensemaking is largely a conscious and deliberate process, akin to story-building. The process of gaining situation awareness can sometimes be deliberate, but is often fast, reflexive and highly automatic.
- Sensemaking is often applied retrospectively, in trying to understand organisational accidents (i.e. why something disastrous or tragic happened). It is generally backward-looking, whereas situation awareness is forward-looking (particularly the prediction and projection processes).
Team Situation Awareness
Although early research into situation awareness focused on individuals, much work has also considered teams. Team situation awareness can be defined as "the degree to which every team member possesses the situation awareness needed for his or her job" (Endsley, 1995, p. 39). 'Shared situation awareness' is a related concept, in which team members also hold the same awareness of those elements their roles have in common (i.e. not sharing everything, only what is necessary where goals overlap).
This shared team awareness is the extent to which the personnel involved have a common mental image of what is happening and an understanding of how others are perceiving the same situation. Situation awareness in these models has two parts: a person’s own knowledge of the situation and their knowledge of what others are doing (and might do if the situation were to change in certain ways). In addition to mental models, expectations and previous experience, the sharing of information between team members is key to the Understanding phase of situation awareness.
Losing situation awareness
A rough rule of thumb might be "if it feels wrong, then it probably is" – but we need something a little more scientific than that. Situation awareness is the understanding of what is happening now and, given that information, what may happen in the future. Given this, there are some clues that situation awareness is becoming degraded:
- Fixation on one thing to the exclusion of everything else
- Poor communications, such as vague or incomplete statements
- Not following established procedures
- Future states that were expected do not materialise
- Not having the ‘time to think’.
Situation awareness may be lost because of fatigue, distractions, stressful situations, high workload, vigilance failures, poorly presented information, forgetting key information and poor mental models. Managing these factors (and other influences on human performance) is central to the human factors approach.
The result of losing situation awareness (or having an inadequate awareness) may be poor decision making, risk-taking and other unsafe behaviours.
Situation awareness in major incidents
In the case of the Texas City refinery explosion on 23 March 2005, 15 workers were killed and 180 injured when a column was overfilled on start-up. The control room operator experienced some difficulty in maintaining an accurate awareness of the situation while monitoring a complex, fast-moving environment. Shift handovers were rushed, logbook entries were vague, human machine interfaces were poorly designed, the workload was high (partly due to insufficient staffing levels) and operators were likely fatigued.
At the Buncefield fuel storage depot on 11 December 2005, a storage tank was overfilled and a large quantity of petrol overflowed, resulting in a vapour cloud that ignited. The plant operators failed to recognise that the tank was overfilling. Similarly to the Texas City event, operators were set up to fail by inadequate human machine interfaces, for example, there was no ability to see the contents of several storage tanks at once. The Buncefield depot received fuel by pipeline from three UK refineries and sometimes the flow rates in these pipelines were changed without the knowledge of the Buncefield staff. As with Texas City, these plant operators were likely fatigued due to the shift system and significant overtime. Together with poor shift handovers, these factors all combined to reduce the situation awareness of those staff who were supervising the movement of fuel between storage tanks.
In the Deepwater Horizon (Macondo) incident, the drilling crew held an inaccurate mental model of the developing situation, partly fed by erroneous assumptions, which ultimately led to inaccurate situation awareness of the well conditions. The crew had a strong expectation that the negative pressure test would be successful. Several factors influenced the crew's decision-making, such as fatigue, distractions, lack of experience, time pressure, lack of clear procedures and social pressure. This case study clearly illustrates how an inaccurate mental model can influence the information that is sought and how that information is interpreted, thereby reinforcing the inaccurate model. Confirmation bias certainly played a role in this disaster.
Incident investigations: A health warning
There is some debate as to how useful the term ‘situation awareness’ is within incident investigations. Dekker (2013) warned that care must be taken in the use of this term, and stated that:
"'loss of situation awareness' is analytically nothing more than a post hoc judgment that says we know more about the situation now than other people apparently did back then" (Dekker, 2013)
There is a danger that a loss of situation awareness becomes a convenient explanation in accident investigations; when in fact, on its own, it explains very little. The use of this term should be a prompt to investigate further. If we’re going to use this concept in our incident investigations, we first need to understand why situation awareness was ‘lost’, in order to gain a richer explanation of what went wrong and why. In the above three investigations (Texas City, Buncefield and Macondo), the investigators took great care to understand the reasons behind inadequate situation awareness.
In an investigation, it may be useful to examine each of the three main stages of information processing, for example:
- Gathering information: data was not observed, or data was not visible, perhaps due to a high workload, distractions, interruptions, or poor design of displays and interfaces.
- Understanding information: use of an incorrect or incomplete mental model, perhaps due to a lack of experience or knowledge, or a cognitive bias, such as confirmation bias.
- Thinking ahead: over-reliance on a mental model or failure to realise that the mental model was incorrect.
Only when the causes of the 'lost' situation awareness are understood can we draw any conclusions about incident causes. For example, if an aircraft crashes into terrain, stating that the pilot lost situation awareness is not helpful. If further investigation showed that key data in the cockpit was not presented saliently to the crew, that finding can lead to tangible actions and physical changes in the cockpit to prevent similar incidents.
Supporting situation awareness through design
A key aspect of supporting people to build and maintain situation awareness is to help them to create an accurate mental model of the system. Having a clear mental model supports people in predicting how something is going to behave in the future. Good design, following human factors principles, promotes a more accurate mental model and improved situation awareness.
Good design supports people to answer these questions: What has happened? What is happening? What might happen?
The role of the designer includes helping users to direct their attention to the right places, and providing a feed of information that helps the user to update their mental model of the situation. Involving end-users early on will help designers to closely match the design with the users' mental model of the system.
For example, if the designer of a control room presents unnecessary information to an operator, or floods them with too much information at any time, they will be unable to maintain situation awareness and decision-making will suffer as a result.
If a new facility is designed to be operated remotely (such as an offshore platform operated from onshore), how operators will obtain and maintain situation awareness should be considered early, before the design becomes too mature.
Some design considerations include:
- Identify what information people will need and in what format
- Present information in a way that aids comprehension, such as making it clear how the current value differs from the expected value or future state (rather than simply presenting raw data)
- Present trends over time to help users project the future state of the system
- Provide users with the ‘big picture’ rather than isolated subsets of information
- Reduce the mental calculations required
- Co-locate people who need to communicate key information regularly in order to maintain their situation awareness
- Reduce distractions and interruptions
- Match the salience of an attention-seeking device (such as an alarm) with its importance
- Make hazardous states highly visible
- Maximise the visibility of missing components following maintenance.
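To make two of these considerations concrete – showing how a current value differs from the expected value, and presenting a trend rather than raw data – here is a hypothetical sketch. The function name, limits and units are invented for illustration; real control-system displays would be far more sophisticated.

```python
# Hypothetical sketch: turn a raw reading into a comprehension-friendly
# summary (deviation from expectation plus a simple linear projection),
# rather than presenting the raw number alone. All names are invented.

def describe_reading(current: float, expected: float, history: list,
                     minutes_ahead: float, limit: float) -> str:
    """Summarise a reading as deviation plus projected future state."""
    deviation = current - expected
    # Crude trend: average change per sample over the recent history.
    if len(history) >= 2:
        rate = (history[-1] - history[0]) / (len(history) - 1)
    else:
        rate = 0.0
    projected = current + rate * minutes_ahead
    status = "WITHIN LIMIT" if projected < limit else "LIMIT EXCEEDED SOON"
    return (f"now {current:.0f} ({deviation:+.0f} vs expected), "
            f"in {minutes_ahead:.0f} min: ~{projected:.0f} [{status}]")

# A raw value of 82 alone says little; deviation and projection add meaning.
print(describe_reading(current=82, expected=75, history=[70, 74, 78, 82],
                       minutes_ahead=5, limit=95))
# -> now 82 (+7 vs expected), in 5 min: ~102 [LIMIT EXCEEDED SOON]
```

The design choice being illustrated is that the display does the mental arithmetic (deviation, trend, time-to-limit) so the operator can spend their capacity on understanding and prediction rather than calculation.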
Acknowledgements to Mica R. Endsley for her early work in describing and defining situation awareness (including the often-cited 'Endsley 1995 model'). Thanks to her work, situation awareness has become a widely used construct in the human factors community, driving the development of advanced information displays and training in many industries.
Reference / Further reading
Endsley, M. R. (1995). Toward a theory of situation awareness in dynamic systems. Human Factors, 37(1), pp. 32–64.