Medico-Legal - Can airline pilots teach GPs to manage risk?

Captain Phil Higton looks, through a pilot's eyes, at ways of reducing levels of risk in healthcare.


'Catastrophic failure': a phrase that sends a chill down the spine of two groups of professionals in particular - GPs and commercial airline pilots.

They have more in common than you might think. Both work in environments that are unforgiving of error, where outcomes can be catastrophic, even fatal, and both rely on teamwork.

Risk management
The aviation industry manages risk significantly better than healthcare, and may have much to teach health professionals about human factors in risk management.

Terema, a training organisation run by doctors and former British Airways pilots, helps to pass on that body of knowledge. At its heart is a single question: what can we in healthcare learn from our counterparts in the aviation industry?

The chance of an individual pilot or GP being involved in a catastrophic event is thankfully small. Herein lies the tyranny of small numbers: if catastrophe is your threshold, you are 'safe' right up to the point of failure.

Being sensitive to events you might perceive as lower risk has two benefits. First, it gives a sense of the margin of safety, and second, it encourages proactive rather than reactive risk management.

My analogy is peripheral neuropathy: the loss of sensation provides no early warning, and a patient may suffer significant injury before the risk of harm is recognised.

Are 12 significant event analyses per year enough to prove that your risks are being managed and that your patients are safe?

In the harsh and often public world of an inquiry into catastrophic failure, a whole range of risks becomes apparent.

Disaster
Aviation has a good, but not perfect, safety record. Accident inquiries invariably uncover more than one root cause for any given safety failure; perhaps half a dozen major contributors, communication failure chief among them, recur across accidents.

What causes these and other risks to conspire and result in catastrophe may be chance, but eliminating any of them stacks the odds in our favour.

The worst UK air disaster until Lockerbie occurred in June 1972, when a British European Airways Trident, 'Papa India', crashed within minutes of taking off from Heathrow, killing all 118 people on board.

The disaster resulted from the high-lift devices on the wing being retracted prematurely, at too low an airspeed, and the aircraft entering a 'deep stall' from which recovery was all but impossible. The inquiry revealed that Trident aircraft had stalled in this way at least three times before, but the pilots concerned had regained control, so reports were not felt to be warranted.

Human factors
A complex web of circumstances contributed to the 1972 crash. The aircraft was serviceable right up to the moment the high-lift devices were retracted.

The flight data recorder tells us what happened to the flying controls and to the aircraft's flight path, but because cockpit voice recorders were not fitted in 1972, we can only speculate about the crew's activities. What we do know is that the captain had had a furious argument minutes before boarding the aircraft.

Since this accident, we have trained pilots in human factors and in managing reduced capacity or overload. We use the acronym HALT (hungry, angry, late or tired) as a prompt to stop and think about the impact of these factors on performance; only then, and with due caution, should pilots proceed.

Significant events
Which events are significant? The industrial safety pioneer Herbert William Heinrich illustrated the rarity of catastrophe with a pyramid: for every serious accident at the tip, there are 29 minor accidents below it and 300 no-injury events at the base.

Effective risk management requires the analysis of minor human and system events that could contribute to a catastrophe. The US National Coordinating Council for Medication Error Reporting and Prevention has a scale for harm ranging from A ('circumstances or events that have the capacity to cause error') to I ('patient death').

Significant event analyses often focus on the range E ('temporary harm to the patient') to I. With the will and appropriate systems we can also learn much from categories A, B, C and D.

Most data at these lower levels seem trivial. Every day, pilots and GPs alike assess situations, make decisions, share, withhold or ignore information, and take action or choose not to. Only when we join the dots between minor events can we reveal the risks.

For instance, one Derbyshire GP practice noticed that some prescriptions were not ready for patients. The cause was a software error that, on investigation, also affected the trigger for follow-up appointments - a more significant risk.

In aviation we talk about 'looking for trouble': confronting risk is better than dealing with the aftermath of failure.

We work hard to harvest minor events and act on them. We identify HALT factors and pause. We involve all staff in managing risk and encourage them to communicate openly with us about it. After that, we rely on luck.
