Ralph Chadkirk is a paramedic for an NHS Ambulance Trust. You can follow him on Twitter @rchadkirk
It is inevitable that, at some point in their career, a healthcare professional will make a mistake. We are all human, and we all have the same fallibilities: tiredness, hunger, pressure and cognitive biases. In short, “to err is human”.
We hope that when we do make a mistake it does not harm the patient. But this isn’t always the case. Mistakes can be catastrophic, resulting in the avoidable death of a patient or life-changing disability. In 2015-16 the NHS Litigation Authority reported that it had received over 10,000 clinical negligence claims, costing a significant amount of money to resolve.
How we deal with errors is important. We need to know why mistakes occur so that we can prevent them from happening again. Let’s consider an appropriately anonymised but real example. In the heat of the moment with a sick patient at three o’clock in the morning, a paramedic picks up some fluid to give to the patient. He intends to give saline but accidentally picks up a bag of glucose. The result could be profound hyperglycaemia and coma. If the reaction to this error is to sack the paramedic, refer him to the regulator for action, pursue a civil claim for damages or even a criminal conviction, then where is the incentive to report it? It might appear easier to the clinician involved to hide the mistake...
We could look at this a different way. If the response to the confusion over which bag of fluid to give were a constructive process, involving but not punishing the clinician, then we might discover that bags of saline and glucose look very similar, and that for the same cost we could purchase saline in a bottle instead, making it physically distinct from glucose. We might also find out that the standard packing list for the paramedic’s response bag had saline and glucose placed in the same pouch; putting them in different pouches would further reduce the chance of confusion. Both of these simple system changes could eliminate this error entirely. But we’d only find that out if the mistake were reported.
The answer to this is a Just Culture. Embraced by the aviation industry, this organisation-wide culture change does not protect employees entirely from the consequences of their mistakes – nobody can be above the law, and incompetence in a role needs to be effectively managed – but it creates an environment where reporting mistakes is encouraged. For a Just Culture to succeed, there needs to be mutual trust between employees, the organisation, external regulators and the judiciary.
If you’re a clinician reading this then be honest if you make a mistake. Report it to your employer, tell others involved in the patient’s care, and fulfil your Duty of Candour responsibilities and tell the patient. If you’re a manager reading this then consider carefully whether punitive action will improve safety in your organisation, or whether it will lead to mistakes being covered up. Remember, a constructive process may lead you to change your system to prevent the same error ever occurring again.