Who’s Flying the Plane

I mentioned in an earlier post my interest in plane crashes, and I had been toying with a presentation based on that interest for quite a while.

A little over a month ago, at the local SQL Server user group here in Albany, I offered to present at the February meeting. I gave them a choice of topics: a talk on Entity Framework and how its defaults can be bad for performance, or a talk on plane crashes and what IT can learn from them. They chose the latter. I guess plane crashes are more exciting than a dry talk on EF.

In any event, the core of the presentation is based on the two plane crashes mentioned in the earlier post: Eastern Airlines Flight 401, the L-1011 that crashed in Florida in 1972, and US Airways Flight 1549, the “Miracle on the Hudson,” in 2009.

I don’t want to reproduce the entire talk here (in part because I’m hoping to present it elsewhere) but I want to highlight one slide:

Flight 401 vs 1549

  • Flight 401 – Perfectly good aircraft
  • Flight 1549 – About as bad as it gets
  • Flight 401 – 101 Fatalities/75 Survivors
  • Flight 1549 – 0 Fatalities

Flight 401 had a burned-out nose gear indicator light, and it crashed.

Flight 1549 had two non-functional engines, and everyone got off safely.

The difference was good communication, planning, and a focus at all times on who was actually flying the airplane.

Think about this the next time you’re in a crisis. Are you communicating well? How is your planning? And is someone actually focused on making sure things don’t get worse because you’re focusing on the wrong problem? I touch upon that here when I talk about driving.

The moral: always make sure someone is “flying the plane”.

Avoiding Mistakes

People often treat mistakes as unavoidable, or assume they happen because the person making them is unfamiliar with the situation or environment.

The truth is far more complex: often mistakes are not only avoidable, they’re a direct result of the person being overly familiar with the situation or environment.

A post that came across my desk the other day discusses the Normalization of Deviance. No, I’m not talking about how references to 50 Shades of Grey are all over the place; I’m talking about how we come to accept errors as “OK,” or even normal.

A classic example is the O-ring burnthrough on shuttle flights prior to the Challenger disaster. The original specs called for no burnthrough; any burnthrough at all was unacceptable. Yet once burnthrough was observed, the basic attitude became that since it hadn’t caused a problem, it was acceptable. At one point, when an O-ring had burned approximately a third of the way through, the claim was made that the O-rings had a safety factor of 3. This is a gross misapplication of the concept of a safety factor: a safety factor measures margin beyond what a part is required to withstand, and a part specced for zero burnthrough that erodes at all is already out of spec, not operating with a threefold margin. By moving the goalposts, they permitted further launches to occur and burnthroughs to continue until mission 51-L, when seven lives were lost. This was a huge management error. The mistake was to ignore the original rule and essentially rewrite it without adequate review. The engineers had become used to the new norm, despite it being wrong.
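To see how the goalposts moved, here’s the claimed arithmetic as a quick sketch (the roughly one-third burn depth is the reported figure; the rest is illustrative):

```python
# The claimed "safety factor of 3": ring depth divided by observed erosion.
ring_depth = 1.0          # normalized O-ring thickness
observed_erosion = 1 / 3  # eroded roughly a third of the way through

claimed_safety_factor = ring_depth / observed_erosion
print(claimed_safety_factor)  # 3.0 -- the comforting number

# But the spec allowed ZERO erosion. Measured against the actual
# requirement, any erosion at all means the margin is already gone.
allowed_erosion = 0.0
print(observed_erosion <= allowed_erosion)  # False -- the part is out of spec
```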

In the example given in the first link above, a different form of deviance occurred: a social deviance that apparently made skipping checklists acceptable.

In the crash, a large red warning device was completely ignored. One would think this was the mistake that caused the crash, but it’s really secondary to the original problem: checklists were developed precisely because humans CAN fail to notice large red warning devices. By not performing the checklist, a mistake was missed and lives were lost. Everything else is sort of fluff.

For pilots, take-offs become a routine procedure. So routine that they begin to make simple mistakes. Had this been their first time flying, or even their first time in that particular model of aircraft, they most certainly would have been paying attention. This is why checklists exist: to eliminate the mistakes routine can introduce. Either pilot should have questioned the lack of a take-off checklist and insisted on its use.

They didn’t, and people died.
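The same discipline applies in IT. Here’s a minimal sketch of what it might look like to treat a checklist as something that must run before routine work proceeds, rather than a document someone may or may not remember to read. The deployment steps are hypothetical examples, not from the talk:

```python
from typing import Callable

# A checklist item: a label plus a check that returns True when satisfied.
Check = tuple[str, Callable[[], bool]]

def run_checklist(name: str, checks: list[Check]) -> bool:
    """Run every check in order; refuse to continue at the first failure."""
    for label, check in checks:
        if not check():
            print(f"{name}: FAILED '{label}' -- stop and fix before continuing")
            return False
        print(f"{name}: ok - {label}")
    return True

# Hypothetical pre-deployment checks for a database change.
checks: list[Check] = [
    ("backup completed within 24 hours", lambda: True),
    ("rollback script tested", lambda: True),
    ("someone is watching production", lambda: True),  # who's flying the plane?
]

if run_checklist("pre-deploy", checks):
    print("safe to proceed")
```

The point isn’t the code; it’s that the checklist runs every single time, no matter how routine the work feels.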


Post hoc ergo propter hoc

One of my favorite shows is The West Wing, and it has an episode with the same name as this post. Unfortunately for you, Aaron Sorkin is a better writer than I am.

That said, this concept, “after it, therefore because of it,” is a common mistake many of us make when forming theories. It’s related to the principle that correlation is not causation.

I was reminded of this the other night when another phrase entered my mind: “Rain Follows the Plow.” This was a hopeful 19th-century theory that as settlers pushed past the 100th meridian, the rain would follow where they plowed. Simply put, by farming the land, rainfall would increase.

The theory sounds a bit perverse until one considers that, for a while, rainfall really did seem to increase as more land came under the plow. So there was some basis for the idea at first; the correlation seemed to hold. However, it turned out to be a short-term climate fluctuation.

Unfortunately the theory was also a product of the idea that humans were the center of creation. As the subsequent Dust Bowl and other events showed, however, this theory was (excuse the bad pun) all wet.

Sometimes correlation is not causation, and we should not let our all-too-human biases influence our theories.
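To see how easy it is to be fooled, here’s a minimal sketch in Python, with made-up numbers standing in for the historical record: two series that merely share an upward trend correlate strongly even though neither causes the other.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1870, 1900)

# Hypothetical stand-ins: acreage grows with settlement, rainfall
# rises during a short-term wet cycle. Neither causes the other.
acres_plowed = 50.0 * (years - 1869) + rng.normal(0, 40, years.size)
rainfall = 18.0 + 0.12 * (years - 1869) + rng.normal(0, 0.8, years.size)

r = np.corrcoef(acres_plowed, rainfall)[0, 1]
print(f"correlation: {r:.2f}")  # strongly positive, yet causally empty
```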

Fortunately, properly done, science is eventually self-correcting. Scientists make mistakes, but over time the winnowing process eliminates them. The idea of scientific racism was once extremely popular, but it has clearly been shown to be false. The idea of a luminiferous ether was likewise shown to be false.

Meanwhile, other theories have continued to hold up to intense scrutiny. As weird as quantum mechanics appears to be, evidence continues to mount that much of the current theory is in fact correct. When scientists appear to have found particles that travel faster than light, the default assumption continues to be (and so far, correctly) that there is an error in the experiment.

Not much of a moral here, other than this: just because the rooster crows when the sun rises, don’t mistake the crowing for the cause of the sunrise.