Ascribing blame in complicated systems
Petter’s latest Mentour Pilot episode on the crash of Sriwijaya Air flight 182 was an eye-opening and tragic example of confirmation bias (emphasis added):
He has, for whatever reason, not been looking down at his instruments for the last twenty seconds or so [which would have shown asymmetric thrust in a faulty system]. The aircraft is inside cloud at this point, meaning that he likely has no outside horizon to guide him.
His mental model would have been that the aircraft was in a right bank, turning towards the heading that he had selected. He now suddenly hears a BANK ANGLE warning. And when he looks down, the first thing he likely sees is the control wheel displaced to the far right, confirming his mental model.
So what would you do if you hear a bank angle warning and see the control wheel to the right? Yes, the captain’s initial reaction was to disconnect the autopilot and move the control wheel to the left.
Since the aircraft was actually already in a hard left roll at this point, this caused the bank to quickly increase.
I appreciate documentary creators who take the time to understand people’s mental models and thought processes in circumstances like this. It’s not an excuse, but an attempt to understand how someone operating under duress reached a fateful decision. Only by doing this can we arm ourselves with the knowledge to design more robust systems and provide better training.
The same goes for IT. I’m baffled by how often discussions about accidents, exploits, or other technical issues focus myopically on operators, and not on the circumstances and system design that led to those fateful decisions being taken.
“People claim to have been taken.” “Taken?” ~ Taken
You can provide someone with all the documentation and training in the world, but an ill-conceived or opaque system that doesn’t account for humans is one destined for failure. This is also why accessibility is a universal concern, not something you can bolt on after the fact.
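To make that concrete, here’s a minimal sketch (all names hypothetical) of one human-aware design choice: a destructive operation that shows the operator the system’s actual live state and requires them to re-type the target’s name, rather than offering a yes/no prompt that a stressed operator, primed by their mental model, will reflexively confirm.

```python
# Minimal sketch, hypothetical names throughout: a destructive action designed
# around confirmation bias. Rather than a yes/no prompt (which an operator
# under pressure tends to accept reflexively), it surfaces the system's actual
# state and demands a deliberate, typed confirmation.

def confirm_destructive_action(resource: str, fetch_live_state) -> bool:
    """Return True only if the operator re-types the exact resource name
    after being shown the live state, not their assumed state."""
    state = fetch_live_state(resource)  # show reality, not the mental model
    print(f"About to DELETE '{resource}'.")
    print(f"Live state: {state}")
    typed = input("Re-type the resource name to proceed (anything else aborts): ")
    return typed == resource

if __name__ == "__main__":
    # Stand-in for a real status lookup.
    proceed = confirm_destructive_action(
        "prod-db-01",
        lambda name: {"replicas": 3, "traffic": "live"},
    )
    print("Deleting..." if proceed else "Aborted.")
```

The point isn’t this particular prompt; it’s that the interface actively corrects the operator’s mental model before they act, instead of quietly assuming it matches reality.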
As my requirements modelling professor Bernard Wong used to say, it’s PEBKAC (Problem Exists Between Keyboard And Chair) until it’s not.