“Mistakes were made” – Ronald Reagan, 27 January 1987 [1]

“I screwed up” – Barack Obama, 3 February 2009 [2]

Medical trends and their associated jargon come and go. Audit, clinical governance and evidence-based medicine take us through the last 2–3 decades, and each has provoked much discussion, debate and, on occasion, bitter argument. A phrase that seems to be everywhere in the last year or two is ‘human factors’; what distinguishes this one from previous battle cries is that there are few voices raised in opposition, amid increasing acceptance that humans are not naturally suited to the modern chaotic, and increasingly pressurised, environment.

The term ‘human factors’ takes in the science of human behaviour, particularly as applied to interactions between humans and other elements of a system, and how they can (or can’t) be adapted to improve performance and safety – recognising that blaming individuals for merely exhibiting human characteristics such as the propensity to error is counter-productive [3]. With an envious eye on the airline and nuclear industries, where there is a long tradition of attention to human factors going back decades [4, 5], those who organise and work in the medical field are starting to catch up, albeit in fits and starts. Thus in the UK in recent years, the Chief Medical Officer and Department of Health (DoH) have repeatedly called for attention to human factors and the need to learn from other industries to prevent failings in the NHS [6–8]¹ – apparently with good results in terms of improving the ‘safety culture’ and moving away from blaming individuals [9] (though those at the coalface may beg to differ [10]).

And so we see meetings and seminars specifically covering human factors in anaesthesia, along with sessions and presentations devoted to the topic at general meetings, and a ‘virtual issue’ of Anaesthesia entitled ‘Safety and human factors’ was compiled last year [11]. Media attention has long been attracted by human factors in medicine (though usually referred to as medical error or blunder), but the recent publication of the study assessing the World Health Organization’s (WHO’s) surgical checklist [12] – largely based around human factors – was accompanied by massive and widespread media coverage, along with a National Patient Safety Agency alert directing every healthcare organisation in England and Wales to implement the checklist ‘for every patient undergoing a surgical procedure’ by February 2010 [13].

One benefit of the WHO checklist is that it requires members of the team to know each other’s names and roles as an aid to communication, as well as ensuring that certain procedural elements of the surgical task are not forgotten – perhaps the most immediately obvious purpose of a checklist [14]. It is said that the concept of a pilot’s checklist was adopted after a disastrous test flight of the Boeing Model 299 (‘Flying Fortress’) in 1935, when it was realised that it was beyond human capability to memorise all the tasks required to fly a machine so complex [15]. The parallels between anaesthesia and aviation have been further highlighted by the pilot Martin Bromiley’s harrowing account in the Royal College of Anaesthetists’ Bulletin of his wife’s death under anaesthesia [16], while the organisational differences that act to prevent our specialty from coming up to speed compared with the aviation industry are summarised by David Gaba, pioneer of anaesthetic simulation, in the same publication [17]. Both plead, as others have, for a systems approach to medical error and a focus on human factors.

It’s one thing to talk about systems and human failings, but another to embed in our daily workplace the safety culture for which so many are calling. Perhaps one reason for this is that focusing too much on systems and organisations can detract from the responsibilities of individuals within those structures. The move from the admission of President Reagan to that of President Obama is a welcome one – albeit one that has taken over 20 years – and reflects the understanding first, that individuals can and will make mistakes, and second, that the first step in dealing with the consequences is to own up. When it comes to medical errors, though, there is a danger that ‘I blame the system’ may become a ready excuse, ringing as hollow as when a thug kicks down a sapling and claims in his defence ‘I blame society’. Each of us has a duty to work towards improving patients’ safety at organisational and professional levels, by involving ourselves in appropriate activities within our hospitals and directly or indirectly supporting national initiatives such as the WHO checklist. We should by now all be aware of our duty to keep our skills and knowledge up to date, as reflected in the need for continuous professional development/education. However, we also have a duty to address our own inherent frailty by increasing our understanding of, and learning tools to minimise, those aspects of human behaviour that contribute to medical failings.

Chief amongst these are fixation errors, a form of disordered situation awareness in which one fails to revise one’s current mental model according to the available information, instead distorting the latter so that it ‘fits’. Thus a person may fail to consider the one problem that is actually present or the solution that is required despite evidence to the contrary (‘everything but that’); he/she may persist with a single diagnosis or plan despite evidence that it is wrong (‘this and only this’); or he/she may continue to believe that there is no problem despite a worsening situation (‘everything is OK’) [18, 19].

Fixation occurs all around us and anaesthetists will be familiar with some ‘classic’ examples: the surgeon who insists that the patient is not bleeding despite clinical signs to the contrary; the anaesthetist who continuously fiddles with the oximeter probe rather than address the worsening hypoxaemia; and the ward nurse who faithfully records worsening vital signs on the observation chart as the patient slips into coma, unnoticed. Fixation can also be more subtle than this; as an example consider a sequence of (true) events that occurred in my hospital after the infusion line of a patient controlled epidural analgesia (PCEA) pump was accidentally connected to the patient’s intravenous cannula, fortunately with no adverse sequelae (Box 1).

Box 1

  • Trainee X discovers the incorrect connection and informs Consultant A, on-call that evening.
  • The next morning Consultant A hands over the case to Consultant B, who initially misunderstands, for the first few minutes thinking that the intravenous infusion had been connected to the epidural rather than the other way round.
  • Consultant B informs Trainee Y, who inserted the epidural, of the incident, taking care to be precise because of the misunderstanding that morning. Despite this, Trainee Y makes the same misinterpretation of the facts, initially understanding them to mean that the intravenous line had been connected to the epidural.
  • Later that day, Consultants A and B are discussing the case, and Consultant B mentions that both he and Trainee Y had misinterpreted the facts being told to them, and how that may have been a form of inherent anaesthetic fixation to focus on the wrong drug being injected epidurally, rather than intravenously, despite the facts provided. Consultant A then reveals that when first informed of the case by Trainee X the previous evening, he too had misinterpreted the information he’d been given in exactly the same way.

Those who teach using simulators see fixation errors time and time again [20, 21], yet despite being so common, fixation error in medicine has been studied surprisingly rarely. In this issue, Fioratou et al. [22] describe a series of experiments in which the readiness of humans to become fixated is explored. Admittedly set in the psychology laboratory rather than the operating theatre, and based around a simple puzzle rather than the complex situation of anaesthetising a patient, Fioratou et al.’s paper nevertheless gives us a fascinating insight into how people think – and fail.

It is generally assumed that one can reduce one’s susceptibility to fixation error by increasing one’s awareness of the risk, becoming more ‘situationally aware’, and habitually stepping back to question and/or revise diagnoses and plans in order to consider all alternatives [23]. Simulator instructors will be convinced of the value of simulation training in achieving these aims, although supporting data are few [24]. Fioratou et al. also suggest how other members of the team can help, and how we must be receptive to such assistance. Time will tell whether Fioratou et al.’s work can be extended to a more realistic (perhaps even real) anaesthetic setting, and whether strategies for preventing and dealing with fixation errors can be explored further. For now, we should all allow ourselves to become fixated on fixation.

Competing interest


I am an instructor in the Simulation Centre, Centre for Clinical Practice, Chelsea and Westminster Hospital, and have given evidence in medicolegal cases where fixation error has been a feature. I am also Consultant B in the scenario given above.

Footnotes
  1. In an ironic demonstration of human error, the DoH’s report of 2006 [6, p. 13] gives the incorrect title when referring to its 2000 report [7].

References
