“Mistakes were made” – Ronald Reagan
27 January 1987
“I screwed up” – Barack Obama
3 February 2009
Medical trends and their associated jargon come and go. Audit, clinical governance and evidence-based medicine take us through the last 2–3 decades, and each has provoked much discussion, debate and, on occasion, bitter argument. A phrase that seems to be everywhere in the last year or two is ‘human factors’; what distinguishes this one from previous battle cries is that there are few voices raised in opposition, amid increasing acceptance that humans are not naturally suited to the modern, chaotic and increasingly pressurised environment.
The term ‘human factors’ takes in the science of human behaviour, particularly as applied to interactions between humans and other elements of a system, and how these can (or cannot) be adapted to improve performance and safety – recognising that blaming individuals for merely exhibiting human characteristics, such as the propensity to error, is counter-productive. With an envious eye on the airline and nuclear industries, where there is a long tradition of attention to human factors going back decades [4, 5], those who organise and work in the medical field are starting to catch up, albeit in fits and starts. Thus in the UK in recent years, the Chief Medical Officer and Department of Health (DoH) have repeatedly called for attention to human factors and the need to learn from other industries to prevent failings in the NHS [6–8] – apparently with good results in terms of improving the ‘safety culture’ and moving away from blaming individuals (though those at the coalface may beg to differ).
And so we see meetings and seminars specifically covering human factors in anaesthesia, along with sessions and presentations devoted to the topic at general meetings, and a ‘virtual issue’ of Anaesthesia entitled ‘Safety and human factors’ was compiled last year. Human factors in medicine have long attracted media attention (though usually under the banner of medical error or blunder), but the recent publication of the study assessing the World Health Organization’s (WHO’s) surgical checklist – largely based around human factors – was accompanied by massive and widespread media coverage, along with a National Patient Safety Agency alert directing every healthcare organisation in England and Wales to implement the checklist ‘for every patient undergoing a surgical procedure’ by February 2010.
One benefit of the WHO checklist is that it requires members of the team to know each other’s names and roles as an aid to communication, as well as ensuring that certain procedural elements of the surgical task are not forgotten – perhaps the most immediately obvious purpose of a checklist. It is said that the concept of a pilot’s checklist was adopted after a disastrous test flight of the Boeing Model 299 (‘Flying Fortress’) in 1935, when it was realised that it was beyond human capability to memorise all the tasks required to fly so complex a machine. The parallels between anaesthesia and aviation have been further highlighted by the pilot Martin Bromiley’s harrowing account, in the Royal College of Anaesthetists’ Bulletin, of his wife’s death under anaesthesia, while the organisational differences that prevent our specialty from coming up to speed with the aviation industry are summarised by David Gaba, pioneer of anaesthetic simulation, in the same publication. Both plead, as others have, for a systems approach to medical error and a focus on human factors.
It is one thing to talk about systems and human failings, but quite another to implement in our daily workplace the safety culture for which so many are calling. Perhaps one reason for this is that focusing too much on systems and organisations can detract from the responsibilities of individuals within those structures. The move from the admission of President Reagan to that of President Obama is a welcome one – albeit one that has taken over 20 years – and reflects the understanding, first, that individuals can and will make mistakes and, second, that the first step in dealing with the consequences is to own up. When it comes to medical errors, though, there is a danger that ‘I blame the system’ may become a ready excuse, ringing as hollow as when a thug kicks down a sapling and claims in his defence ‘I blame society’. Each of us has a duty to work towards improving patients’ safety at organisational and professional levels, by involving ourselves in appropriate activities within our hospitals and by directly or indirectly supporting national initiatives such as the WHO checklist. We should by now all be aware of our duty to keep our skills and knowledge up to date, as reflected in the need for continuing professional development/education. However, we also have a duty to address our own inherent frailty by increasing our understanding of, and learning tools to minimise, those aspects of human behaviour that contribute to medical failings.
Chief amongst these are fixation errors, a form of disordered situation awareness in which one fails to revise one’s current mental model according to the available information, instead distorting the latter so that it ‘fits’. Thus a person may fail to consider the one problem that is actually present, or the solution that is required, despite evidence to the contrary (‘everything but that’); he/she may persist with a single diagnosis or plan despite evidence that it is wrong (‘this and only this’); or he/she may continue to believe that there is no problem despite a worsening situation (‘everything is OK’) [18, 19].
Fixation occurs all around us, and anaesthetists will be familiar with some ‘classic’ examples: the surgeon who insists that the patient is not bleeding despite clinical signs to the contrary; the anaesthetist who continuously fiddles with the oximeter probe rather than addressing the worsening hypoxaemia; and the ward nurse who faithfully records worsening vital signs on the observation chart as the patient slips, unnoticed, into coma. Fixation can also be more subtle than this; as an example, consider a sequence of (true) events that occurred in my hospital after the infusion line of a patient-controlled epidural analgesia (PCEA) pump was accidentally connected to the patient’s intravenous cannula, fortunately with no adverse sequelae (Box 1).