You’re a good pilot
An okay pilot, at least. You've got a few years' gliding experience now, your logbook filling quickly with fun, memorable flights and plenty of flying hours. The early days of cautious apprehension are behind you, yet you keep an eye out for risks that might emerge.
You've studied all those human factors and threat and error management topics, you know what they mean, and you are attentive to lookout and good airmanship. It's been a long time since anyone chipped you about your flying standards and decisions.
Currency and recency? Not too bad, though you would have preferred to fly more if not for life pressures and some recent bad weather. Your last flight review was benign: no major concerns raised, a good outcome. You feel fit and well, rested, in control of yourself and your glider. All good, eh?
Sound familiar?
What expectations might you have when you next fly? What level of risk, what types of risks, might you be unwittingly accepting? To what extent might you have complacency filters operating?
So, you make plans for your next flight. You expect it will go well. In fact, you are sure that you would not even take off if there were undue risks. No way, you say.
You find yourself surrounded by friends and peers at the flight line. Airworthy and prepped. All good, they say. They encourage you to get airborne.
So, what kinds of pressures are you under now? How might these affect your decisions?
Perhaps this flight goes well, almost pristine, and perhaps not.
We know this: nobody, ever, sets off to fly with the intention of coming to harm, or harming others. We also know that as humans, we are prone to making errors, usually with no or very low consequence, but occasionally worse…
Now consider this
When you last had a scare inflight, to what extent was that scary event shaped by the decisions and preparations you made, shaped by your biases and expectations?
Was confirmation or optimism bias a factor? ‘Just as I always thought…I thought I could do it, just like every time before…’
Was plan continuation or sunk cost bias a factor? 'Well, we got this far…' 'Nearly there…' 'It's worked so well so far and seems okay, just a bit more…'
Was expectation bias an issue, seeing what you believed was there rather than the stark reality? 'Believing is seeing…' 'I saw what I have always seen before…' 'Of course it was working okay…'
To what extent did your training and priority management have to kick in to resolve the scary event? Was the outcome good, or bad, or in between?
We hear tales in hangar and clubroom chats, sometimes laughing, sometimes shaking our heads in dismay at what we hear, sometimes sad at the outcomes and impacts on friends and colleagues. Sometimes we grieve. Sometimes we are reminded of near misses, massive learning experiences, even taller tales to share with others.
What about judgement and hindsight? Occasionally tales are told of accidents and serious incidents, where we automatically rush to judgement. 'Pilot error', we hear, then 'I would never do that!', or 'How dumb is that?' With blamestorming filters applied, it's tempting to skip past the key questions. What led the pilot to those circumstances? What preconditions might have applied? What social, peer or club cultural pressures might they have faced? What errors did others make? What modified their risk appetite, eroded their safety margins, or increased their willingness to push themselves harder? What interventions were tried, and what might have worked?
20/20 hindsight is a wonderful thing, bestowing instant wisdom, yes? No! Hindsight bias can lead people to avoid asking the difficult questions about contributing cultural and organisational factors, or even worse, their own susceptibility to making these same errors!
It's a natural human tendency to want to understand why things go wrong, then attribute causal factors with a dollop of blame added. False binaries often apply; gliding safety involves judgements and decisions in an environment with lots of grey, not just black and white. Rules don't fix all errors, nor does automatic resort to new (non-standard) procedures. It's easy to blame others, harder to blame ourselves or our friends.
Understanding better ways?
We never set out to come to harm, or to harm others. We make errors of varying consequence. We are all susceptible to biases that may drive sequences of decisions and actions with undesirable consequences. Rather than blame, we owe it to all our friends and colleagues to make the best of our safety insights, including some less comfortable issues. We must question ourselves, understand our biases, and adopt better preventive and planning practices.
So, let's be kind and respectful to each other, learn more, and apply those lessons in a precautionary sense.
Next time you plan and prepare for flying, perhaps a different mindset might be useful?
Susceptible to confirmation bias?
Drew McKinnie
Safety Manager