There are a lot of sports metaphors used to inspire our best performances, but the one I like most when it comes to fighting is from the Greatest himself:
“The fight is won or lost far away from witnesses—behind the lines, in the gym, and out there on the road, long before I dance under those lights.” – Muhammad Ali
In August, I wrote in this space: “Nobody has ever won a fistfight with the autoflight.” It might be that this piece should have come first—a discussion of how crews, both veterans and novices, can prevent an undesired aircraft state, a sort of fistfight for pilots. If you’re going to avoid going down in this kind of fight, just like the Greatest suggests, you’d better put in the work in advance.
When we organized our book Automation Airmanship: Nine Principles for Operating Glass Cockpit Aircraft, we decided that our focus would not be on the mistakes crews have made over the last 30 years and the accidents that have resulted. Instead, we turned the spotlight on what the very best crews in the profession do to avoid these large system failures in the first place.
We wanted to share our insight into what makes up a kind of superpower that some pilots demonstrate: a special “knowledge reserve” that makes a few pilots perform consistently better when given the same set of in-flight conditions as their peers.
While engaged with organizations of all types and sizes, military and civil, fixed-wing and rotary, we kept seeing this quality on display at about the same rate, no matter where in the world we were being employed to devise robust flight protocols for the cockpits of new fleets.
There were—always—one or more pilots (usually at least two in any given flight department) who could explain the systems better and more thoroughly and operate with greater ease during flight operations, even though they had nearly the same training and experience with the new technology as their peers.
In the end, we fit all of the qualities of this small group into the last of our Nine Principles: Logic Knowledge. It was the definitive “missing piece” in distinguishing “good” from “great” when evaluating individual pilots. It continues to be so today, and it may play an even more important role as aircraft become more reliant on digital control systems and software.
It explains how crews operating the same make and model of aircraft in similar flight conditions can experience the same system casualty and have completely opposite outcomes—one tragic, and the other so routine it barely meets reporting criteria. (We discuss this in Automation Airmanship by describing what the crews of two other nearly identical aircraft did to overcome almost the same system failure that confronted the ill-fated crew of Air France 447 in June 2009, just months apart in that same year.)
Better performing crews on the advanced flight deck are those who know that they are part of what engineers and analysts have named a “tightly coupled” system. They take every opportunity to know all they can about the technology and other forces that they control, to minimize the likelihood that, when forces align in just the right way, they will be the source of failure. (Tight coupling is a mechanical term meaning there is no slack, buffer, or give between two items; what happens in one directly affects what happens in the other, as described by Charles Perrow in Normal Accidents.)
These individuals go beyond what their training has required of them by forming detailed mental models of how aircraft systems work and interact, based on their deep knowledge of the systems themselves. (I think the reason why there are normally at least two pilots in any organization who demonstrate this extra quality is that when someone discovers an insight into how a system works, part of their learning is to explain it to others as simply as they can, and this usually takes at least two people.)
In Working Minds: A Practitioner’s Guide to Cognitive Task Analysis, one prominent group of human factors researchers describes in the clearest terms what the most modern aircraft cockpits demand of each pilot:
“Flight management systems require pilots to engage in sensemaking to fit the FMS analysis with the instrument data. Pilots depend heavily on their mental models of the logic driving the FMS.”
This is not news to most pilots—even as training organizations spend less and less time explaining and demonstrating system logic, and more on the rote procedures that coax the desired flight path control out of the technology for whatever phase of flight the crew and aircraft are in at the time.
It leaves a performance gap that is present in nearly every flight department—between what the very best are doing and what everyone else accepts as good enough.
But mastering just a few of these skills could enable every pilot to perform at an elite level. Our challenge was to extract as much of the visible evidence of this skill from our observations as we could and explain it in clear and certain terms that every pilot could understand, so that each could reach the next level of personal airmanship.
Here are some of the things these pilots know beyond what they learned in transition training:
1. The basic laws and phases of flight their aircraft is designed around;
2. Altitude level-off logic (during climb, cruise, descent, approach, and go-around);
3. Descent logic (from cruise down to alert height, decision height, or decision altitude);
4. Autopilot and autothrottle connect and disconnect logic;
5. The indirect mode-change logic that results from delegating tasks to the automation;
6. Operator authority limits that are intended to maintain the aircraft flight path within design maneuvering limits;
7. Feedback mechanisms (color changes, word changes, etc.) that display mode states.
There are more, but pilots who are determined to bring their knowledge to the next level can start here to have the most immediate impact on their personal performance.
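For pilots who like to see logic written down, the sketch below illustrates items 2 and 5 in the most generic terms possible. It is a minimal, hypothetical model, not the logic of any actual FMS or autoflight system; the function name, the 20-second lead time, and the numbers in the example are assumptions chosen only to show why a high rate of descent produces an early altitude-capture annunciation, a detail that figures in the case study that follows.

```python
# Toy illustration only: a simplified altitude-capture rule, not the logic of
# any real FMS or autoflight system. Names and numbers are assumptions.

def capture_should_arm(current_alt_ft: float,
                       target_alt_ft: float,
                       vertical_speed_fpm: float,
                       lead_time_s: float = 20.0) -> bool:
    """Return True when this hypothetical autoflight would begin its
    level-off capture: when the aircraft will reach the target altitude
    within `lead_time_s` seconds at the current closure rate."""
    closure_fpm = vertical_speed_fpm          # negative when descending
    altitude_to_go_ft = target_alt_ft - current_alt_ft
    # No capture if we are level, already at the target, or moving away from it.
    if closure_fpm == 0 or (altitude_to_go_ft * closure_fpm) <= 0:
        return False
    seconds_to_target = abs(altitude_to_go_ft) / abs(closure_fpm) * 60.0
    return seconds_to_target <= lead_time_s

# With a 20-second lead, a 4,000 fpm descent arms capture roughly 1,300 ft
# above the target, while a 1,000 fpm descent arms it only about 330 ft above
# it; an aggressive descent therefore produces an early capture annunciation.
print(capture_should_arm(12_300, 11_000, -4_000))  # True
print(capture_should_arm(12_300, 11_000, -1_000))  # False
```

The point is not the particular numbers; it is that capture timing is driven by closure rate, so the same target altitude can produce very different mode behavior depending on how aggressively the aircraft is descending.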
A Personal Case Study. On a recent flight leg from the U.S. to a busy Asian destination, my crew and I were in our initial descent as cleared by the controller—still in RVSM airspace—with a crossing restriction ahead issued by ATC that we had just programmed into the FMS. Flight guidance and speed control were coupled through the autopilot and autothrottle, per normal operations.
We were arriving in the terminal area at the end of an eight-hour leg, at the airport’s peak arrival period; the controller was doing an excellent job of managing her workload, and we were in compliance with our clearance. After an “expedite descent” instruction from ATC, our TCAS display showed proximate traffic ahead and to our right, converging and also descending, probably heading for the same waypoint we had in our FMS.
In this complex, fast-moving, and tightly coupled system, we each knew without saying it that whatever we did was going to factor into the outcome the controller was seeking—to deconflict traffic while keeping the arrivals in sequence and, above all, avoid a loss of separation. Seconds after we had increased our descent to around 4,000 feet per minute by extending full speed brakes, with an altitude capture mode indication on our FMA (due to our high rate of descent), we were issued an immediate level-off by ATC, at an altitude 2,000 feet above the FMS target.
Unable to communicate with the converging traffic to revise their clearance, the controller did the next best thing and issued an urgent instruction to an aircraft she was in communication with (us). Our challenge was to execute an immediate level-off that was in conflict with the commanded flight mode (from the previously programmed constraint, already in capture mode) and do it within the next few seconds. I was flying—so I can say exactly what the pilot flying was thinking: “Here’s a conflict between the commanded flight guidance and our clearance, and the risk to the ATC system is real. The best solution is the smooth disconnection of the autoflight and autothrottle, retraction of the speed brakes, followed by an immediate manually flown level-off from a high rate of descent, and then a reconfiguration of the FMS and flight guidance to comply with the new clearance, and ultimately, re-connection of the autopilot and autothrottle.”
It all went off like clockwork, but the mental representation of what the aircraft autoflight was commanding versus what we wanted and needed was the key to a smooth transition to another flight mode without compromising aircraft or ATC limits. The system worked as planned, and our aircraft passed safely over the conflict aircraft without so much as a traffic alert from the TCAS.
ATC thanked us for our assistance and switched us to the next sector as though it was all in a day’s work. Which it was, right? In our debrief just before deplaning, each member of my crew shared their perceptions of the situation and their takeaways, which taken together demonstrated just how many systems were interacting, and how important it is to know the logic of the systems we control in order to play our part.
In short, we avoided a fistfight with the autoflight because we knew how it was configured (the airplane’s systems), how that was in opposition to what the bigger system required (the airspace system), and exactly the steps that were required to resolve the conflict.
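For readers who find a few lines of code clearer than prose, here is one more hypothetical sketch, this time of the comparison the pilot flying was making: the altitude the autoflight was still committed to versus the altitude ATC had just assigned. The 2,000-foot difference and the action list come from the story above; the function name, the 2,000-fpm threshold, and the specific numbers are illustrative assumptions, not any aircraft’s actual logic.

```python
# Hypothetical illustration of the mental model described above, not any
# aircraft's actual autoflight logic. All names and thresholds are assumptions.

def resolve_level_off_conflict(fms_target_alt_ft: float,
                               newly_cleared_alt_ft: float,
                               vertical_speed_fpm: float) -> list[str]:
    """If the autoflight is still capturing an altitude below the new
    clearance while descending rapidly, hand-fly the level-off; otherwise
    simply update the selected altitude and let the autoflight comply."""
    descending_through_clearance = (
        vertical_speed_fpm < 0 and newly_cleared_alt_ft > fms_target_alt_ft
    )
    if descending_through_clearance and abs(vertical_speed_fpm) > 2_000:
        # The commanded guidance opposes the clearance and time is short:
        # the sequence the crew in the case study flew.
        return [
            "disconnect autopilot and autothrottle",
            "retract speed brakes",
            "manually level off at the newly cleared altitude",
            "reprogram FMS and flight guidance for the new clearance",
            "re-engage autopilot and autothrottle",
        ]
    return ["set the new altitude and monitor the autoflight capture"]

# Approximate case-study numbers: capturing a target 2,000 ft below the new
# clearance while descending at about 4,000 fpm calls for manual intervention.
for step in resolve_level_off_conflict(9_000, 11_000, -4_000):
    print(step)
```

In the real event, of course, the judgment was made in seconds and without anything resembling a flowchart; the value of logic knowledge is that the comparison is already loaded in memory when it is needed.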
The safety industry continues to find new ways to prove that the human remains the “weakest link” in the complex, tightly coupled system we all work in. It’s up to us as individuals to take every possible step to reduce the likelihood that we will be the source of failure on any given day when, unexpectedly and without warning, forces align to challenge the limits of the tightly coupled system we live in and the knowledge that it depends on.
Chris Lutat is managing partner of Convergent Performance, a B777 captain, and co-author of “Automation Airmanship: Nine Principles for Operating Glass Cockpit Aircraft.”
The opinions expressed in this column are those of the author and not necessarily endorsed by AIN Media Group.