SEO Title
Checklist Discipline: Avoiding the Simple Stupid Stuff that Kills
Subtitle
Investigators and researchers have pointed to numerous reasons for noncompliance with checklists, but skipping them can prove fatal.
Subject Area
Channel
Teaser Text
Investigators and researchers have pointed to numerous reasons for noncompliance with checklists, but skipping them can prove fatal.
Content Body

Often the “simple stupid stuff” ends up killing people in aviation. Forgetting to disengage a gust lock, turn on the probe heat, or properly set the flaps or trim before takeoff has resulted in fatal accidents. Checklists—one of the most basic tools in a pilot’s toolkit—are designed to overcome the limitations of pilot memory and ensure that action items are completed in sequence, without omission. In each of these accidents, the checklist was either disregarded or its use was undermined by a more complex human factors error.

Normal checklists enhance flight safety and enable the pilot(s) to confirm that safety-critical systems and controls are correctly and consistently configured for each phase of flight. Asaf Degani, a former researcher at NASA Ames Research Center, co-authored a study on the Human Factors of Flight Deck Checklists and identified the primary purpose of the normal checklist: “The major function of the checklist is to ensure that the crew will properly configure the plane for flight and maintain this level of quality throughout the flight and in every flight.”

Degani, now a technical fellow at General Motors' Research and Development Center, noted checklist use was particularly important in takeoff, approach, and landing. He said, “Although these segments comprise only 27 percent of an average flight duration, they account for 76 percent of hull loss accidents.”

Checklists, as a topic of discussion, went mainstream in 2009 when Atul Gawande published his book The Checklist Manifesto: How to Get Things Right. Using aviation as a backdrop, Gawande offered a convincing argument for adopting checklists in modern life and business. The premise of his argument is an examination of why humans fail and why they need checklists. Gawande concluded that we fail for two reasons: ignorance or ineptitude.

Ignorance is explained as “we may err because science has given us only a partial understanding of the world and how it works”—in other words, we simply do not have all the information available to us. For example, the use of idle reverse during taxi may be prohibited on a specific aircraft type, but if no written guidance is available, there is no way for the pilot to know that they erred.

Ineptitude is defined as “an instance where knowledge exists, yet we fail to apply it correctly.” In this case, the flaps may have been mis-set for takeoff and a checklist was available to trap the error, but the crew intentionally chose not to use it.

“Failures of ignorance we can forgive,” according to Gawande. “If the knowledge of the best thing to do in a given situation does not exist, we are happy to have people simply make their best effort.” He continued, “But if knowledge exists and is not applied correctly, it is difficult not to be infuriated.”

Gawande added, “Experts need checklists; think of them as written processes to guide them through the key steps in any complex procedure.” In aviation, pilots have many resources such as checklists, quick reference handbooks, aircraft flight manuals, and flight ops manuals to build knowledge and have a more complete understanding of their world or operational environment.

Tale of Two Tails—the Origin of the Checklist

In 1934, the U.S. Army Air Corps held a competition to replace the service’s vulnerable twin-engine Martin B-10 bomber. Boeing campaigned its Model 299, a revolutionary aircraft for its time: a four-engine bomber with long range and a “self-defense” system that included multiple gun turrets. The Model 299 was a leap forward and heavily favored to win the competition.

During a “fly off” demonstration in October 1935, the aircraft departed Wright Field near Dayton, Ohio, with a highly experienced test pilot at the controls. Witnesses reported that “the aircraft broke ground in a tail low attitude” and, as the speed increased, “the bomber’s nose went much higher than normal.” The aircraft reached a maximum height of about 300 feet, stalled, and crashed into a field, left wing first, bursting into flames. Of the five men aboard, three survived; the Army test pilot and Boeing’s chief test pilot later died of their injuries.

Only the tail section of the aircraft was recovered from the crash and fire. With this piece of evidence, investigators discovered the cause of the accident—an internal control lock that immobilized the elevator and rudder. They determined that it was improbable for any pilot to take off successfully with the control lock engaged.

Following the accident, Boeing could not complete the competition and an initial order for the aircraft was canceled by the Army Air Corps. Because the Army still favored Boeing’s bomber, the service found a legal loophole to purchase 13 YB-17s (the new designation for the Model 299) for further evaluation.

Boeing and the Army Air Corps concluded that the YB-17 was too big and complex for any pilot to fly safely from memory alone. To avoid another accident, a checklist was developed for takeoff, flight, before landing, landing, and after landing. The checklist proved successful, and the aircraft went on to become the B-17 Flying Fortress, with more than 12,000 examples delivered during World War II.

The GIV’s gust lock control is designed to prevent application of full power, and it should be disengaged after starting the engines.

Tale of Two Tails—a Renewed Emphasis

Fast forward nearly 80 years, and another accident involving a gust lock created a watershed moment for business aviation. In May 2014, a Gulfstream IV attempted to take off from Hanscom Field in Bedford, Massachusetts, with the flight control gust lock engaged. Unable to rotate, the crew attempted to reject the takeoff at high speed, by which point the aircraft could no longer stop on the remaining runway. The aircraft overran the paved surface of Runway 11 and collided with ground obstructions before coming to rest in a gully. A post-crash fire destroyed the aircraft and killed all seven occupants: three crew members and four passengers.

Investigators were able to recover data and information from the flight data recorder (FDR), cockpit voice recorder (CVR), and quick access recorder to reconstruct not only the accident flight but to investigate past flights and actions (or inactions) by the flight crew.

Both pilots—a 45-year-old captain and a 61-year-old first officer—had been employed by the operator for many years and had flown the accident aircraft for the past seven years. The first officer (occupying the right seat for this leg) was the chief pilot and director of maintenance. 

The investigation of the GIV overrun at Hanscom Field in 2014 identified a significant lack of checklist use, flight control checks, and pre-takeoff briefings on many flights before the accident.

Prior to departure from Bedford, once the passengers were boarded, the engines were started and the aircraft began to taxi to Runway 11. During this time, the CVR “recorded minimal verbal communication between the flight crewmembers and there was no discussion or mention of checklist or takeoff planning.” The Gulfstream IV aircraft flight manual includes five checklists to be completed prior to takeoff.

The “Starting Engines” checklist included the requirement to disengage the gust lock. Likewise, the “After Starting Engines” checklist included a flight control check—confirming that the flight controls move freely and correctly through their full range of travel. Had these OEM-specified checklist items been completed, they would have revealed that the gust lock was still engaged. FDR data showed that the flight crew had skipped the flight control check on nearly all of the previously recorded flights, completing it on only two of the previous 176 flights.
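
To illustrate how a sequential, challenge-and-response checklist traps a configuration error of this kind, here is a minimal, hypothetical sketch in Python. The item wording and aircraft-state fields are invented for the example; they are not the Gulfstream checklists or any certified procedure.

```python
# Illustrative sketch only: the state fields and item wording are invented
# for this example and are not the actual Gulfstream IV checklists.

aircraft_state = {
    "gust_lock_engaged": True,        # the unsafe condition in the Bedford accident
    "controls_free_and_correct": False,
}

after_start_items = [
    ("Gust lock ... OFF", lambda s: not s["gust_lock_engaged"]),
    ("Flight controls ... FREE AND CORRECT", lambda s: s["controls_free_and_correct"]),
]

def run_checklist(items, state):
    """Challenge each item in order; any unsatisfied item stops the flow before takeoff."""
    for challenge, check in items:
        if not check(state):
            print(f"STOP: '{challenge}' not satisfied")
            return False
        print(f"{challenge} ... CHECKED")
    return True

run_checklist(after_start_items, aircraft_state)
```

The point of the sketch is simply that each item is confirmed against the actual aircraft state, in sequence, so a skipped or assumed item is exactly where an unsafe configuration slips through.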

A contract pilot interviewed during the investigation stated that he had flown with the accident pilot for several years and that “he habitually did not use a formal item-by-item checklist.” In the final NTSB accident report, one of the contributing factors cited was the “flight crew’s habitual noncompliance with checklists.”

The final report on the Bedford crash made several safety recommendations, including a call for the International Business Aviation Council (IBAC) to amend its IS-BAO auditing standards to include verification that operators comply with best practices for checklist execution, and for NBAA to work with operators to analyze existing data for noncompliance with flight control checks.

King Air Pilots and Checklists

Unfortunately, less than three years after the Bedford GIV crash, there was another business aviation accident in which checklist usage may have been a factor. In February 2017, a Beechcraft King Air B200 crashed shortly after takeoff from Essendon Airport in Victoria, Australia, killing the Australian pilot and four American passengers.

In its report, the Australian Transport Safety Bureau (ATSB) noted that the aircraft began to yaw to the left shortly after rotation. As the aircraft entered a shallow climb, it developed a substantial left sideslip with minimal roll, then began to descend and collided with buildings at a retail outlet center.

The ATSB found that the pilot did not detect that the aircraft’s rudder trim was set to the full nose-left position prior to takeoff. The position of the rudder trim, the report concluded, had a significant effect on controllability and resulted in the loss of control.

ATSB investigators also noted an increased risk arising from the use of an incorrect manufacturer’s checklist and from incorrect application of the aircraft’s checklists. Reviewing the normal checklist for the B200, investigators determined that the rudder trim would have been checked five times prior to takeoff had the checklist been followed.

Two years later, a King Air 350 crashed shortly after takeoff at Addison Airport near Dallas, Texas, killing all 10 people aboard. The NTSB determined the probable cause of this accident to be “the pilot’s failure to maintain aircraft control following the reduction in thrust in the left engine during takeoff…Contributing to the accident was the pilot’s failure to conduct the airplane manufacturer’s emergency procedure following the loss of power in one engine and to follow the manufacturer’s checklists during all phases of operation.”

The report determined that, “given the lack of callouts for checklists on the CVR and the pilot's consistently reported history of not using a checklist, it is possible that he did not check or adjust the setting of the power lever friction locks before the accident flight.” The NTSB surmised that the loss of power on the left engine may have been caused by throttle lever migration—the uncommanded movement of the throttle lever, a known issue with the King Air.

Why Bother?

In the final report on the Essendon King Air crash, the ATSB explored checklist discipline in detail. The ATSB found that various research studies provided insights into “why checklist procedures may not always be completed.” Four categories were discussed: attitude, distractions and interruptions, expectations and perception, and time pressure.

Attitude was cited as “probably the greatest enemy of error-free, disciplined checklist use.” The study determined that a lack of motivation was the biggest hindrance to using a checklist in the way it was intended to be used.

Distractions and interruptions disrupt the sequential flow of the checklist. Not only must the pilot then remember where in the checklist the disruption occurred, but the interruption may also lead to a checklist error or omission. One technique to counter distractions and interruptions is to restart the checklist from the beginning, as sketched below.
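
The following is a minimal, hypothetical sketch of that restart-from-the-top technique, written in Python for illustration only; it is not drawn from any real checklist or avionics software, and the item wording is invented.

```python
# Hypothetical illustration of the "restart after an interruption" technique;
# not drawn from any real checklist or avionics software.

class ChecklistInterrupted(Exception):
    """Raised when an external event (radio call, passenger query) breaks the flow."""

def run_with_restart(items, perform_item):
    """Run items in strict order; if interrupted, discard partial progress and
    start again from the first item rather than trusting memory to resume."""
    while True:
        try:
            for item in items:
                perform_item(item)
            return  # every item confirmed in one uninterrupted pass
        except ChecklistInterrupted:
            continue  # repeat the entire checklist from the top

# Example usage with invented item wording
before_takeoff = [
    "Gust lock ... DISENGAGED",
    "Flight controls ... FREE AND CORRECT",
    "Flaps ... SET FOR TAKEOFF",
    "Trim ... SET",
]

run_with_restart(before_takeoff, lambda item: print("Checked:", item))
```

The cost of repeating a few items is small compared with the cost of resuming in the wrong place and leaving one unchecked.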

Expectations and perception are powerful forces that pilots must manage. A NASA study found that “when the same task is performed repetitively, such as a checklist, the process becomes automatic.” This allows the user to process information more quickly and reduces workload, but the user may also begin to “see what one is used to seeing.” For example, a pilot accustomed to seeing “Flaps 5” selected may perceive Flaps 5 even when Flaps 1 or no flaps at all are selected.

Time pressure may affect the accuracy of the check. As pressure increases, pilots may rush the checklist and scan items too quickly, increasing the possibility of an error.

The ATSB also looked at “why checklists sometimes fail to catch errors.” A 2010 NASA study (Dismukes and Berman) conducted 60 observation flights on three different airlines. These observations identified close to 900 deviations, 22 percent of which related to checklist usage. The deviations fell into six types: flow-check performed as read-do, responding without looking, checklist item omitted (or performed incorrectly or incompletely), poor timing of the checklist, checklist performed from memory, and failure to initiate the checklist.

In aviation, it is the simple stupid stuff that kills, and pilots continue to repeat the mistakes of the past. Bedford was an “eye-opener” for business aviation, but 27 years before that tragic event, a Northwest MD-82 crashed because the crew failed to set the flaps for takeoff and failed to perform the taxi checklist that would have trapped the error; the sole survivor aboard was a young child. Almost exactly 21 years later, a Spanair MD-82 crashed in Madrid, killing 154 people, for nearly the same reason. Checklists trap errors and can prevent accidents and the loss of life. It is infuriating that pilots choose not to use a tool designed to prevent them.

Expert Opinion
False
Ads Enabled
True
Used in Print
False
AIN Story ID
111
Writer(s) - Credited
Publication Date (intermediate)
AIN Publication Date
----------------------------