Tony Kern, founding partner and CEO of Convergent Performance and for 28 years the opening-day speaker at Bombardier Safety Standdown, challenged this year’s audience on the event theme of “Elevate Your Influence.” “As you listen to me, as you listen to all the other speakers, I want you to think about reversing this theme a little bit, and ask yourself, what influences you? Who influences you?”
Tying together the influence theme and how business aviation professionals can help each other with safety issues related to preparedness, Kern illustrated his points with Venn diagrams showing the intersection between unexpected events and lack of vigilance, complacency, and apathy. “There’s a lot of shade in that Venn diagram,” he said. “What if we could replace that? What if we could influence ourselves? What if we could influence others and replace that apathy and complacency? No one is apathetic or complacent overall; it goes up and down. Sometimes we're apathetic, sometimes we're bored, sometimes we're complacent, and other times we're hypervigilant.”
Kern’s point was that bolstering vigilance can help prevent accidents and prepare pilots to handle the unexpected. “I want to think about…moving that spectrum a little bit and recognizing when we begin to drift. Now we get that unexpected event, but it's not so unexpected. [Shift] to we’ve touched the switches [with our eyes closed], we know where the [emergency] door handle is, and we can keep [accidents or issues] at bay.”
Where’s the Risk?
An important factor to understand, Kern emphasized, “is that the risk is not where you think it is.” He illustrated this point by showing videos of heroic sports plays—for example, New York Mets baseball infielder Luis Guillorme’s famous midair catch of a bat inadvertently let loose by the batter. He drove the point home during the rest of his presentation by throwing balls into the audience, keeping attendees on high alert, if not hypervigilant.
Risk in aviation is primarily due to the human factor, whether it’s pilots, mechanics, or engineers. Yet, he added, “Not only is aviation incredibly safe, but we are.” We must avoid lying to ourselves about safety and assuming that just because we say aviation is safe, no more work needs to be done to ensure safety.
“We have to understand that there are accidents lurking inside of us, waiting for the holes in the Swiss cheese to line up to where we're the last piece that can prevent it,” Kern said. “We have to understand it can happen to us, and to some extent, we have to believe that it will happen to us. We have to attack this idea of how incredibly safe aviation is inside our own noggins. We bring our lunch bucket to work. We fix an airplane. We climb in an airplane. Everything goes well. We land. Customers are happy, we're happy, and we go home.”
But if that’s the case, little useful learning is happening. “There's so much we can learn from every single flight, every single pre-brief and debrief, everything, and we have to [do that],” he said.
“One of the things that influences us is this idea that we're pretty good, right? Which is good, confidence is good, don't get me wrong. But we are never good enough. Are you tough enough to take a hit, professionally, personally, physically, and keep moving? Are you tough enough to not let your feelings get in the way when you don't get a job or somebody gets promoted ahead of you, or something else happens in your organization or your life? Are you tough enough to realize what I just said is true, that you can never be prepared enough because you don't know what level the challenge is at? And have the courage to offer constructive, honest, objective criticism to others, but also when you look in the mirror? Are we tough enough to do that?”
Black Swan Lesson
This was Kern’s “most important lesson” of the day: “Things that have never happened before to you or to people happen all the time. These black swan events that we never think can happen to us—that the odds are astronomical, won't happen to us—they happen all the time to somebody and we have to be able to evaluate this concept.
“We get our type ratings, we go through training, we meet the minimum standards, and we fly for thousands of hours and years, and nothing bad happens, and we begin to get this idea that we're good enough. Every once in a while, something like this comes up that reminds us we might not be, and things can turn in a minute. I don't know how many of you have ever had a serious emergency. When you do, you're going to realize that your brain doesn't work the same, but when you've got something to fall back on…I trained with fog goggles. I know where the switches are. I know where the hatch is. I'm going to survive this. You've got somewhere to go.”
What this means is being prepared, he explained. “Preparation is getting ready for something you may or may not ever encounter, and then you've got to couple that with the state of mind that says it could happen right now. So preparation, plus that vigilance that says I'm going to be here now, I'm ready for it, equals some level of what I want to call readiness. But the most important thing to understand is that readiness can only be determined after the challenge arises, and it may or may not be the one that we train for in the simulator. It may not be the one that has happened enough times to create a trend in our safety management systems. It may be something completely odd and strange, so the best thing we can do is to constantly look at the levels of preparedness that we have.
“And there’s a lot of elements to that. It's aircraft knowledge and systems knowledge. It's empathy with your team, being able to understand the tone of voice, whether that air traffic controller is getting stressed out and might do something silly. All of those things come to play into our level of preparedness.”
Hypervigilance
There is more to this, however, and Kern believes not just in vigilance but hypervigilance. He doesn’t agree with Webster’s dictionary definition: the state of being abnormally alert. To Kern, hypervigilance means being “normally alert for abnormal things to occur.”
The seemingly normal act of aviating remains inherently risky. “Our field of play has always been dangerous,” he said, “even when we're sitting around hangars and flying airplanes. Never forget what an airplane is. It’s an aluminum eggshell that has highly combustible materials in it with all kinds of kinetic energy and fire happening right beside us, right underneath us, and we take it up to altitudes where the time of useful consciousness can be measured in seconds. We navigate around thunderstorms, where lightning is as hot as the surface of the sun. And then we come down, we land on little, tiny strips of concrete, and we call it normal. Yeah, that's not normal. That's a high-risk environment.”
Kern asked, “So how do we get this hypervigilance?” People who exhibit this trait “see deeper, and they're able to attach meaning to the things that they see. That's a conscious process. It's not going to be intuitive when you see something new that you're going to figure out how to process what to do with it. That requires you to think. That's a conscious act of diving deeper.
“Experts not only know more; they sustain this state of hypervigilance. It's a mental process of always preparing, never knowing what the challenge is going to be, and staying fully aware that things could go south at any moment. And staying humble about it, because the only way you learn is when you can check your ego at the door.
“We can always learn. We can always get better. What prevents us is the expert's curse of reading the press clippings that say we're safe enough or resting on our laurels because we're really good at what we do. You don't self-select to a safety conference unless you care about getting better. So I know I'm kind of preaching to the choir here, but can you take that same exuberance for learning, applying, and sharing back and demonstrate it for others?
A Burning Desire
“Expert readiness doesn't happen by accident. You've got to have this burning desire to get better, and being prepared means that constant unease that says, no matter how good I am, that next challenge may be a little better than I'm ready for, and then when that moment arrives and you execute—win or lose—you’re kind of happy because I was preparing all along, and finally, I got challenged, and now I can learn a little bit more and even get better.
“So fully comprehend that in that box up there practicing precision and picturing perfection is where true readiness lies. Now, how do we take it out? The first thing you need to realize is that when we influence others, we get better because of it. Every time we sit down with a group of people and have a discussion and try to impart our wisdom and learn things from them, we are lifting our own bar. We are holding ourselves to a higher standard. You need to understand that it's not an act of altruism to help somebody out. It's self-serving as well. We can get better. We can become more expert when we try to help others.”
Kern wrapped up his session by reminding the audience of their importance. “Think about how fortunate you are to be one of a handful of people to be part of this industry. We fly in this little atmosphere on this little teeming-full-of-life planet. Never think you're not important. Never, ever think that you're too small to make a difference because we get this one life to make a difference and influence us forever.
“We have no idea where it's going to ripple. Something I say you may take and change a little bit. Say it to somebody else, they may take it, change a little bit, and it just goes out there. It's how we get better as a species, it's how we get better as an industry, it's how we get better as a company, it's how we get better as a team, it's how we get better as a family, and it's how we get better as individuals, because we want to get better, and we want other people to get better.”
It’s important, Kern said, to “realize that hypervigilance, which is expecting abnormal things to occur and being ready to deal with them, is a core competency for the chaos that we're flying in right now. Realize that experts see a different world. Are you seeing that world, or are you still looking through the lens of the status quo? Never forget, somebody's always watching you, listening to you, and taking cues from you. Be the example that they need.”