Posted by: Airtower | 2011-12-23

Side-effects and “Human Error”

Have you ever considered possible side-effects when making decisions in a stressful situation? A while ago I read “What Really Happened Aboard Air France 447” by Jeff Wise, which explains what happened aboard the Air France plane that crashed into the Atlantic Ocean on June 1, 2009. The pilots’ actions reminded me of a German book by Dietrich Dörner called “Die Logik des Mißlingens” (English title: “The Logic of Failure”), which describes typical failures in human thinking, some of which I noticed in the pilots’ actions. Being aware of these common problems helps prevent them, so I thought I’d write about it. I’m neither a pilot nor an aviation engineer, so I can’t really comment on the technical side of events, but the human behavior is very interesting.

Well, what happened?

I’m going to give a simplified summary of the events, only as far as necessary to understand my points, using excerpts from the original article, which I encourage you to read for more details.

AF447 flew into a thunderstorm area over the mid-Atlantic, and unusually warm air prevented the plane from flying over instead of through the storm. Inside the clouds the speed sensors (pitot tubes) iced over, so the autopilot shut down due to lack of data. At this point the captain was taking a rest (not unusual on a long flight), so his co-pilots Bonin and Robert were at the controls.

Note, however, that the plane has suffered no mechanical malfunction. Aside from the loss of airspeed indication, everything is working fine. […] But neither Bonin nor Roberts has ever received training in how to deal with an unreliable airspeed indicator at cruise altitude, or in flying the airplane by hand under such conditions.

Of course this is a very difficult situation, and another article says we should be careful not to blame the pilots too quickly, but it was still the incorrect response to the technical problems that turned a difficult situation into a disaster.

Bonin reacts irrationally. He pulls back on the side stick to put the airplane into a steep climb, despite having recently discussed the fact that the plane could not safely ascend due to the unusually high external temperature.

This is where the real problem started. The plane became unstable and stalled.

Then the stall warning sounds. […] A stall is a potentially dangerous situation that can result from flying too slowly. At a critical speed, a wing suddenly becomes much less effective at generating lift, and a plane can plunge precipitously. All pilots are trained to push the controls forward when they’re at risk of a stall so the plane will dive and gain speed.

However, both pilots ignored the alarm and didn’t even talk about it, possibly because they thought automatic safety systems would prevent stalling. Under normal circumstances this would have been correct, but like the autopilot, these safety systems had switched off due to lack of sensor data.
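As a brief aside on the physics behind the quoted explanation (standard aerodynamics, nothing specific to this aircraft): lift grows with the square of airspeed,

    L = ½ · ρ · v² · S · C_L

where ρ is the air density, v the airspeed, S the wing area and C_L the lift coefficient. C_L increases with the angle of attack only up to a critical angle; beyond it the airflow separates from the wing and lift collapses. Holding the nose up both bleeds off v and pushes the angle of attack past that critical point, which is exactly the stall the quote describes.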

Instead of reconsidering his commands, Bonin kept pulling his control stick back. He kept doing this almost the entire time until the crash, even when his colleague took over. When the two side sticks give conflicting inputs, the flight control computer averages them (a small sketch after the transcript below illustrates the consequence), so Robert was unable to lower the plane’s nose. Soon the co-pilots called for the captain. When he arrived, he sat down behind his co-pilots, probably because changing seats during a critical situation would have been dangerous. But this also meant he couldn’t see what Bonin was doing, and apparently Robert didn’t notice either. Only a few seconds before the crash did Bonin say what he was doing, and the captain immediately saw the problem:

02:13:40 (Bonin) Mais je suis à fond à cabrer depuis tout à l’heure!
But I’ve had the stick back the whole time!

At last, Bonin tells the others the crucial fact whose import he has so grievously failed to understand himself.

02:13:42 (Captain) Non, non, non… Ne remonte pas… non, non.
No, no, no… Don’t climb… no, no.

Bonin let go and Robert took over the controls, but the plane was still descending fast and it was difficult to build up enough speed and stabilize. Soon another alarm warned that they were quickly approaching the surface, and Bonin made the final mistake.

At any rate, without warning his colleagues, Bonin once again takes back the controls and pulls his side stick all the way back.

02:14:23 (Robert) Putain, on va taper… C’est pas vrai!
Damn it, we’re going to crash… This can’t be happening!

02:14:25 (Bonin) Mais qu’est-ce que se passe?
But what’s happening?

02:14:27 (Captain) 10 degrés d’assiette…
Ten degrees of pitch…

Exactly 1.4 seconds later, the cockpit voice recorder stops.
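Because the averaging of conflicting side stick inputs played such a role here, below is a minimal sketch of the idea. To be clear, this is my own illustration in Python, not Airbus code; the function and variable names are made up:

    # Toy model of why averaged dual inputs can mask one pilot's commands.
    # Stick positions range from -1.0 (full forward, nose down)
    # to +1.0 (full back, nose up).

    def combined_pitch_input(left_stick: float, right_stick: float) -> float:
        """Average the two side stick inputs into a single pitch command."""
        return (left_stick + right_stick) / 2.0

    # Robert pushes his stick fully forward to recover from the stall...
    robert = -1.0
    # ...while Bonin silently holds his stick fully back.
    bonin = +1.0

    print(combined_pitch_input(robert, bonin))  # 0.0 -- the inputs cancel out

Robert’s full nose-down command is silently cancelled out; unless he looks over at Bonin’s stick or Bonin says something, he has no way of knowing why the plane isn’t responding.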

Error patterns

I see three of the typical problems mentioned by Dörner here:

  1. Ignoring the consequences of one’s actions
  2. Ignoring safety rules
  3. Failure to cooperate

Ignoring consequences

For unknown reasons, Bonin didn’t understand that pulling the plane’s nose up was causing the loss of speed, and consequently of height, instead of the intended climb. But even without seeing this connection, the absence of the intended result should have prompted him to reconsider.

Ignoring safety rules

Ignoring safety mechanisms plays a role in many accidents big and small, up to disasters like the Chernobyl reactor explosion. Sometimes it is done out of convenience and becomes a bad habit when it doesn’t immediately cause trouble; on AF447 it seems the pilots thought the stall alarm was wrong. Of course safety systems can fail, too, but from the transcript it seems to me they never really considered the possibility that the alarm was right.

Failure to cooperate

For far too long, Bonin didn’t tell his colleagues that he was pulling back his control stick, and even after the captain told him to stop, he later ignored the order, apparently out of panic.

Lessons learned

Sadly there is no magic trick to rewire one’s brain to avoid these kinds of typical human mistakes, but as Dörner writes, awareness and (where applicable) training can help prevent them.

So first off, be aware of these problems. Don’t ignore safety rules, and try to keep your eyes open for unintended consequences – especially if it looks like the intended consequences aren’t occurring. Whenever possible you should also think about possible side-effects before acting. Ignoring unintended consequences is also prevalent in slowly developing fields, like politics, where consequences can take decades to become visible.

Secondly, in some fields training helps a lot. In the article on AF447, Wise writes that other pilots were able to continue the flight successfully in a simulator after the loss of speed data. While it’s not certain, I think it’s reasonable to assume that practicing responses to this kind of condition would have helped the pilots avoid the mistakes they made.

I know this sounds pretty theoretical, but it is what I try to do.


Responses

  1. Thank you for this very interesting article and for introducing “The Logic of Failure”. I, too, think that training for unusual conditions can help prevent such accidents. I’ve already ordered the book and am looking forward to reading it.
    By the way, Happy New Year from Sendai.

    • Thank you, and have fun reading! 🙂

  2. Hey sry I didn’t read the article before ^^

    So here’s my comment:

    That’s really interesting, although only partly new 😄
    I definitely agree that preparing for possible though unlikely situations helps, like my sister does: she prepares answers in her head for stupid guys trying to talk to her, so she always has smart answers 😉 Ok, not as big an example as the flight, where lives are at stake, but still a similar principle (for me: I never expect to be spoken to, so I was totally out of it the two times someone talked to me and didn’t know what to say >.>)
    But I also guess there are probably situations that those designing the training don’t foresee, so there are no preparatory exercises for them. So I think it’s even more important to really teach about these mistakes and make people aware of them, because if people know them, the chance of remembering them in a crucial situation rises enormously.
    Btw, I think one of the most important factors in why these mistakes happen is panic. Many people tend not to think in an emergency situation, and there are probably some who can’t do so even with training ^^* In a simulation it’s much easier not to panic, knowing nothing serious can happen; it’s a very different matter when things actually happen. So even training is no guarantee these mistakes won’t happen (although the chances improve).

    ok so much from me ^^

    c you

    • Now that’s a surprising example. 😉
      I agree that it’s much easier to stay calm in a simulation or emergency drill because you know nothing serious will happen if you fail, but that’s exactly the point. With enough practice you’ll react correctly even while panicking. The students described in this article provide a great example: http://www.japantimes.co.jp/text/nn20110604f1.html

