SEO Title
AINsight: Single-pilot Flight Operations and Lemon Juice
Subtitle
Can psychological discoveries improve flight safety?
Subject Area
Channel
Teaser Text
A pilot with a false assessment of their abilities may take off into a thunderstorm or attempt a hazardous circling maneuver that leads to a fatal accident.
Content Body

Psychology researchers have found that some of the dumbest criminals and the cockiest pilots may have something in common: a hazardous cognitive bias—the Dunning–Kruger effect—that hinders self-perception, clouds judgment, and leads individuals to overestimate their ability. New psychological discoveries may provide a cure.

For the criminal, an overly optimistic assessment of the skills required to rob a bank may send them directly to jail, whereas a pilot with a false assessment of their abilities may take off into a thunderstorm, continue to fly into degraded visual conditions, or attempt a hazardous circling maneuver that leads to a fatal accident.

Research into the Dunning–Kruger effect began with an attempted bank robbery that involved a unique element—lemon juice. In April 1995, McArthur Wheeler robbed two Pittsburgh banks in broad daylight without a mask. Security cameras captured clear images of his face and his five-foot-six, 270-pound frame. Within hours of the images airing on the local news, police received a tip and took Wheeler into custody.

Bank robber Wheeler—a resident of a Pittsburgh suburb—made an odd confession to detectives, exclaiming, “But I wore the juice.” Confused, the detectives pressed Wheeler on what he meant. The suspect stated that he knew lemon juice was an ingredient in invisible ink—thus he thought “logically” that if he covered his face in lemon juice, then his face would be invisible to the security cameras.

Other than his eyes burning from the juice, Wheeler was convinced that his plan would work. He was incredibly mistaken, promptly convicted, and later featured in the 1996 World Almanac as the world’s dumbest criminal.

This entry in the World Almanac is what tipped off Cornell University psychology professor David Dunning to a larger truth: “those most lacking in knowledge and skills are the least able to appreciate that lack.” It launched a four-year study in which Dunning and graduate student Justin Kruger ran a series of experiments comparing students’ actual performance in a given subject with their own estimates of how they would rank against their peers.

The results showed that the students who scored lowest held greatly exaggerated estimates of how they would compare with others. Dunning was shocked: “Those who scored near the bottom estimated that their skills were superior to two-thirds of the other students.”

Unskilled and Unaware

These experiments, both on and off campus, led the duo to publish a paper entitled “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments.” Thus, the Dunning–Kruger effect was born.

The Dunning–Kruger effect is the tendency of people with low ability or skills in a specific area to give an overly positive assessment of that ability. Lacking the foundational skill or knowledge needed to evaluate their own competence, they overestimate it.

Conversely, the research found that highly skilled individuals often underestimate their ability, assuming that everyone else performs at about the same level.

No one is immune from this bias; it is a systematic tendency to engage in erroneous forms of thinking and judging. According to Dunning, “We all have pockets of incompetence and errors [that] are invisible to us.”

Keep in mind that the Dunning–Kruger effect requires a minimum amount of knowledge or experience; you must know just enough to be “ignorant of your ignorance.” Bad drivers, for example, overwhelmingly believe they are good drivers, while those who don’t drive at all cannot make that judgment.

In aviation, the Dunning–Kruger effect often skews the self-assessment of low-time pilots, pilots transitioning to a new aircraft, pilots performing a complex task without proficiency or recent experience, and mechanics working unassisted on an unfamiliar aircraft or engine.

A classic example of the Dunning–Kruger effect is the stigma attached to the Beechcraft V35 Bonanza, nicknamed the “V-tailed doctor killer.” A physician may be incredibly competent in their medical practice, but stepping into a complex aircraft with little experience can be disastrous. Today, the Bonanza’s place is taken by complex single-pilot turboprops and light jets flown by upwardly mobile individuals with little flight experience.

Single-pilot Ops

The NBAA Safety Committee has identified improving the single-pilot accident rate as a top focus area, saying, “Single-pilot operations have enhanced risks when compared to multi-pilot operations, demonstrated by the fact that single-pilot aircraft are 30 percent more likely to be involved in an accident than aircraft with dual-pilot crews.”

As identified in my earlier blog about managing risk in single-pilot operations, these pilots can better manage risk by employing the tools in the NBAA’s “Risk Management Guide for Single-Pilot Light Business Aircraft.” The guide is a great primer on the fundamentals of risk management and provides an easy-to-use flight risk assessment tool (FRAT).
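
To make the mechanics concrete, the sketch below shows how a generic FRAT works: each hazard the pilot acknowledges adds points, and the total maps to a go, mitigate, or no-go decision. The factors, point values, and thresholds here are illustrative assumptions, not the scoring from the NBAA guide.

```python
# Minimal illustrative FRAT (flight risk assessment tool) sketch.
# The factors, point values, and thresholds are hypothetical examples,
# not the NBAA guide's actual scoring.

# Each risk factor the pilot answers "yes" to adds points to the total.
RISK_FACTORS = {
    "less than 100 hours in type": 5,
    "night departure": 3,
    "crosswind above personal limit": 4,
    "fewer than 8 hours of sleep": 4,
    "circling approach expected": 5,
    "no recent instrument proficiency check": 3,
}

def assess(answers: dict) -> str:
    """Sum the points for every acknowledged hazard and map the
    total onto hypothetical go / mitigate / no-go thresholds."""
    score = sum(points for factor, points in RISK_FACTORS.items()
                if answers.get(factor, False))
    if score >= 15:
        return f"score {score}: NO-GO"
    if score >= 8:
        return f"score {score}: mitigate risks or seek a second opinion"
    return f"score {score}: acceptable risk"

# Example: a low-time pilot departing at night, short on sleep.
print(assess({
    "less than 100 hours in type": True,
    "night departure": True,
    "fewer than 8 hours of sleep": True,
}))  # -> "score 12: mitigate risks or seek a second opinion"
```

The arithmetic is trivial; the value lies in forcing an honest answer to each question, and honest answers are exactly what the Dunning–Kruger effect undermines.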

The greatest benefit of a FRAT is that it puts the pilot in the right mindset to identify, assess, and mitigate risk, which works well if the pilot has the foundational knowledge to use the tool effectively. But if your self-assessment skills are poor, how can you accurately determine the level of risk?

The problem is when “you don’t know what you don’t know.” Compounding the issue is the natural human tendency to believe we are “better than average”; the lowest performers in Dunning and Kruger’s research rated themselves well above average.

Case in point: the 2016 crash of a Cessna Citation CJ4 that killed the pilot and five passengers departing in poor weather from Burke Lakefront Airport (KBKL) in Cleveland, Ohio. In this accident, both fatigue (17 hours of wakefulness) and possibly the Dunning–Kruger effect (low skills and experience) influenced the pilot’s decision to depart late at night in marginal VMC with strong, gusty winds. The accident pilot was a private pilot with approximately 1,200 hours total time, a three-week-old CE-525 type rating, and less than 10 hours as pilot-in-command of the CJ4.

What would a forensic psychologist think? It would be interesting to better understand the pilot’s self-perception and estimation of his abilities in the hours leading up to the accident. Likewise, with so little experience in the CJ4, would he be able to effectively use a FRAT to mitigate the risk? According to Dunning–Kruger, his judgment likely would have been impaired.

The NTSB determined that the probable cause of this accident was “controlled flight into terrain (CFIT) due to pilot spatial disorientation. Contributing to the accident were pilot fatigue, mode confusion related to the status of the autopilot, and negative learning transfer due to flight guidance panel and attitude indicator differences from the pilot’s previous flight experience.”

Fixing Stupid by Overcoming Overconfidence

As actor John Cleese lightheartedly explains the Dunning–Kruger effect in an online video, “If you are very, very stupid, how can you possibly realize that you’re very, very stupid?” It’s a bit of a trick: you must be relatively intelligent to realize that you’re dumb.

While Cleese’s assessment is funny, it’s not technically correct since the Dunning–Kruger effect is measured by comparing a self-assessment with objective performance, not IQ.

New discoveries in psychology are fascinating. In the mid-2000s, psychologists began to study a phenomenon called intellectual humility, a metacognitive trait defined as “the recognition of the limits of one’s knowledge and an awareness of one’s fallibility.”

According to researchers, it involves several components: not thinking too highly of oneself, refraining from believing one’s views are superior to others’, lacking intellectual vanity, being open to new ideas, and acknowledging one’s mistakes and shortcomings. Individuals with higher levels of intellectual humility benefit from improved decision-making, more positive social interactions, and better moderation of conflict.

Social psychologist Scott Plous has identified the consequences of overconfidence as problematic, saying, “No problem in judgment and decision-making is more prevalent and more potentially catastrophic than overconfidence.” Beyond aircraft accidents, overconfidence has been blamed for lawsuits, wars, and stock market bubbles and crashes.

Intellectual humility can be learned, or at least strengthened, by shifting to a growth mindset: the belief that intelligence can be developed rather than being a fixed trait. A growth mindset makes it easier to acknowledge a lack of understanding (what you don’t know) and to become comfortable acknowledging the intellectual strengths of others (a mentor).

Another exercise that supports intellectual humility is to consider a situation from a third-person perspective; the psychological distance this creates increases objectivity and puts the situation in a new context. It’s not that Jane or John cannot fly a circling approach to minimums at night; it’s that an objective analysis of the risk shows it is far safer to fly a straight-in instrument approach, even if that means diverting to an alternate.

It is also important to honestly assess your own limitations and accept that humans are fallible. Take inventory of the activities in which your knowledge is limited and make a plan to learn or deepen your understanding. Likewise, set personal limitations when you’re new to an aircraft: if the aircraft flight manual states that the crosswind limit is 32 knots, adopt a personal limit of 20 or 25 knots until you gain more experience.
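
As a sketch of that discipline, a personal limit works best when it’s written down in advance and checked mechanically rather than renegotiated in the moment. The snippet below uses the hypothetical 32-knot and 20-knot figures from the paragraph above.

```python
# Personal-minimums check; the 32- and 20-knot figures are the
# hypothetical numbers from the text, not from any real AFM.
AFM_CROSSWIND_LIMIT_KT = 32       # what the airplane is approved for
PERSONAL_CROSSWIND_LIMIT_KT = 20  # what this pilot accepts for now

def crosswind_is_acceptable(reported_crosswind_kt: float) -> bool:
    # The binding number is always the more conservative of the two.
    return reported_crosswind_kt <= min(AFM_CROSSWIND_LIMIT_KT,
                                        PERSONAL_CROSSWIND_LIMIT_KT)

print(crosswind_is_acceptable(17))  # True: within the personal limit
print(crosswind_is_acceptable(25))  # False: legal for the airplane, a personal no-go
```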

Critics of these theories may discount them as psychobabble. In fact, one general aviation magazine editor suggested that pilots must learn from their own mistakes, even if that means a damaged aircraft and an insurance claim.

That’s a bit irresponsible, and I’d rather make lemonade. I’d counter that no pilot ever sets out to crash an aircraft or harm others; something influenced the decision to begin or continue a flight that had catastrophic consequences.

Understanding that your self-perception, or your assessment of your own abilities, can be blinded simply by not knowing something is powerful. It takes humility to stop, think about the situation, and accept that if you don’t know, you should get another opinion or simply not go.

The opinions expressed in this column are those of the author and not necessarily endorsed by AIN Media Group.

Expert Opinion
True
Ads Enabled
True
Used in Print
False
Writer(s) - Credited
Newsletter Headline
AINsight: Single-pilot Ops and Lemon Juice
Newsletter Body

Psychology researchers have found that some of the dumbest criminals and the cockiest pilots may have something in common: a hazardous cognitive bias—the Dunning–Kruger effect—that hinders self-perception, clouds judgment, and leads individuals to overestimate their ability. New psychological discoveries may provide a cure.

For the criminal, an overly optimistic assessment of the skills required to rob a bank may send them directly to jail, whereas a pilot with a false assessment of their abilities may take off into a thunderstorm, continue to fly into degraded visual conditions, or attempt a hazardous circling maneuver that leads to a fatal accident.


Solutions in Business Aviation
0
Publication Date (intermediate)
AIN Publication Date
----------------------------