Human Error… Often talked about, written about, and used to close out accident and incident investigations. But what is meant by human error? I have learned over the past few days that my perception contrasts with that of others working in the safety field, and I wanted to scope this out a bit more, share my thoughts, and hopefully generate some discussion…
According to the Collins Dictionary, human error is “when someone makes a mistake which causes an accident or causes something bad to happen.” The definitions are many and varied; Stephen Shorrock has a nice tongue-in-cheek definition compiled from the many versions out there: “Someone did (or did not do) something that they were not (or were) supposed to do according to someone.”
Over my years of study into all things safety and complexity, I tend to raise an eyebrow and cast a very cynical gaze when I hear the term ‘human error’ used to describe why things went wrong. Attending a week-long Masterclass in Human Factors and Safety with Prof Sidney Dekker and Robert J de Boer in 2015 challenged what I had always been taught about human error, and in no small way shaped my current stance on it. From experience, ‘human error’ tends to come attached to blame (even in so-called ‘no blame’ cultures), and the connotation is that some person made a mistake. This can be very frustrating for frontline workers who are working with less-than-ideal systems that are ultimately setting them up for failure.
Let me tell you a story about ‘human error’… A nurse I know (let’s call her Ann) recently administered what turned out to be the wrong drug to a patient in her care. This was an undesirable outcome which could have had devastating effects for the patient, though thankfully the patient was not harmed in this instance. After the incident investigation, the cause of the undesirable outcome was deemed to be human error, and Ann was sent for retraining with a pharmacist on the proper procedure and protocol to follow when administering drugs to patients. Some might agree with this course of action and believe it will prevent a recurrence. Let me give you some more details on the same incident.
On the shift in question, Ann came on night duty and looked at the handwritten prescription in the patient’s file, which happened to have been written in illegible handwriting by a senior consultant. She questioned the content of the prescription and asked the nurse going off duty at handover what drug was being alluded to (let’s call it drug abc123). “Is this drug abc123?” “Yes,” said the nurse going off duty, “it is drug abc123.” Before administering drugs, nurses are required to have another nurse do a double sign-off as part of the drug administration procedure. Ann asked this second nurse to double-check that she was administering the correct drug (abc123) to the patient, and she was given the nod. Off she went and administered the drug, which we now know to have been the wrong drug. All three nurses had struggled to identify the correct drug because of the difficulty of reading an illegible prescription.
If we want to prevent the administration of incorrect drugs from happening again, will labelling Ann with ‘human error’ and sending her for retraining on drug administration procedure deliver on this? Is it possible that another nurse might find themselves in the same situation as Ann in the future? Is ‘human error’ an appropriate label to place on Ann in this case? Did she make a mistake? Or did she do what she does every day: try to deal with multiple demands in a far-from-perfect system, and look after her patients? Would looking at the bigger picture, at the latent conditions in place setting Ann and her fellow nurses up for failure, be a better approach to preventing the same thing happening in the future? Might it be worth moving with technical progress and going digital with prescriptions?
Ann cried herself to sleep after coming off nights and was upset for some days afterwards at what she perceived to be the injustice of a system which set her up for failure, yet blamed her for the outcome.
I would ask any safety person reading this, when next investigating an incident or accident, to think long and hard before placing the ‘human error’ label on someone without genuinely delving deeper and probing to make sense of what led to the undesirable outcome.
I would suggest that an appropriate addition to Alexander Pope’s famous quote “to err is human” might be: “to err is human, but to be human is not necessarily to err”. By this I mean: don’t always jump to the conclusion that the person at the sharp end (the nurse administering the drug, the pilot flying the aeroplane, the operator carrying out high-risk activities) has made a mistake. Look instead further upstream within the system, and be curious as to how your organisation might be setting these people up for failure by not taking into account the fact that they are human, and only capable of so much (that is not a flaw, by the way, just a fact of being human). If we exert too much pressure, or expect perfect delivery of service and duties from our employees when they are working with less-than-ideal tools and systems, it is reasonably foreseeable that the humans in the system may struggle at times and undesirable outcomes may come to pass. Rather than have disengagement, disillusionment, and contempt from employees working in what they may well perceive to be a bureaucratic, punitive system that labels them with ‘human error’, wouldn’t it add far more value to the business overall to engage with people, make it safe for them to recount exactly what led to the incident, delve further upstream into the system, and learn from it to genuinely try to prevent a recurrence?
Last week I travelled to the UK to spend a couple of days with some Paradigm Human Performance colleagues. Teresa Swinton (director) walked me through an incident investigation she undertook recently for a client. First she showed me the initial investigation done in-house, then her own investigation into the same incident. The two were poles apart. The first investigation report, taking the Safety-I approach, led to some contractors on a construction site being dismissed for their involvement in the incident. It looked very much at the sharp end, and focused blame there.
Teresa’s investigation report, taking the Safety-II approach, looked further upstream within the organisation from where the incident occurred. It examined influencing factors deep within the organisation that directly and indirectly shaped the outcome of the incident. Management weren’t aware of it, but some of their reward systems were directly influencing how work was done on site, and leading to undesirable outcomes. Teresa’s recommendations resulted in the dismissed contractors being reinstated, and in management taking a fundamentally different approach to how they view and react to incidents. Note: there was no mention of ‘human error’ in this report.
One of the quotes Teresa used during a Paradigm training day was ‘beware of the shadow you cast as a leader’. If you genuinely don’t want people getting hurt in your organisation, try to look holistically at the system rather than just at those at the sharp end, and be mindful of your use of words (especially ‘human error’). Walk your talk, and make sure your video is aligned with your audio; your shadow won’t lie…