SEO Title
AINsight: Now Everywhere, Can AI Improve Aviation Safety?
Subtitle
A simple query to ChatGPT about improving aviation safety yielded 10 solid answers, but delving deeper into them still requires critical thinking.
Subject Area
Channel
Teaser Text
A simple query to ChatGPT about improving aviation safety yielded 10 solid answers, but delving deeper into them still requires critical thinking.
Content Body

Artificial intelligence (AI) applications have created a buzz on the internet and with investors, and have the potential to transform the aviation industry. From flight data analytics to optimized route and fuel planning applications, AI, in its infancy, is making an impact on aviation—at least operationally. But can it improve aviation safety?

Natural language AI chatbots such as ChatGPT, according to technology publication Digital Trends, “continue to dazzle the internet with AI-generated content, morphing from a novel chatbot into a piece of technology that is driving the next era of innovation.” In a mixed outlook, one article states, “No tech product in recent memory has sparked as much interest, controversy, fear, and excitement.” 

First launched as a prototype in November 2022, ChatGPT quickly grew to more than 100 million users by January 2023. Last month, traffic grew by more than 54 percent, and the service is closing in on one billion unique users per month.

ChatGPT is a chatbot built on what is called a large language model (LLM). According to Digital Trends, “These neural networks are trained on huge quantities of information from the internet for deep learning—meaning they generate altogether new responses, rather than regurgitating specific canned responses.”

While most adults in the U.S. have heard about ChatGPT, only 14 percent have used it. To find out what all the hype is about, I asked ChatGPT a few aviation-related questions. It’s a bit tongue-in-cheek, but this was an exercise to satisfy my curiosity and to see whether the responses were accurate or innovative.

For fun, I submitted the following prompt to ChatGPT: “How can we improve aviation safety?”

In a matter of seconds, the bot generated a tidy response with an opening statement and 10 key safety measures. It acknowledged that “aviation safety is a critical concern, and there are several ways to improve it.”
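
For anyone curious how the same exercise looks outside the chat window, the query can also be submitted programmatically. The sketch below is a minimal, hypothetical example assuming OpenAI’s Python SDK and an API key in the environment; the model name is a placeholder, not a recommendation.

```python
# Minimal sketch: submit the same prompt through OpenAI's chat completions API.
# Assumes the openai Python SDK (v1+) is installed and OPENAI_API_KEY is set;
# the model name is a placeholder.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "How can we improve aviation safety?"}],
)

# The reply comes back as ordinary text, just as it does in the chat window.
print(response.choices[0].message.content)
```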

The 10 safety measures fell into seven categories: enhanced training and education for pilots, maintenance personnel, and air traffic controllers; stronger safety regulations; implementation of advanced technology, including AI; investment in research and development, including air traffic control (ATC) and national airspace system (NAS) modernization; improved data sharing; comprehensive safety audits; and the fostering of a safety culture that includes improved reporting and communications.

A brief description of each safety measure was provided, but most relied on overused jargon and buzzwords such as safety culture, data sharing, and best practices. (A pro tip for the college student experimenting with generative AI: provide a little context for each of those terms.)

In general, many of these are “white cape” safety measures that are easier to talk about than to implement. Improved safety regulations, for example, often fall victim to powerful lobbying groups in Washington, D.C. A case in point is the more proactive, science-based Part 117 duty and rest rules, which apply to all Part 121 operators except cargo airlines.

Under the implementation of advanced technologies, ChatGPT engaged in some serious self-promotion of AI: “Utilize automation and artificial intelligence to enhance decision-making processes, reduce human error, and provide real-time safety information.”

Agreed, these are areas where AI will shine. Recently, the Notam Alliance, a team of notam end-users, pilots, dispatchers, airlines, and other aircraft operators, ran an exercise to create a “super notam” that helps solve issues with the readability and usability of notams. The group used ChatGPT to see if notams could be understood by a machine, and the results were promising: during the demonstration, AI could understand a notam more than 98 percent of the time. (More on this in an upcoming AIN article.)
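
The alliance’s exact test setup isn’t detailed here, but the general approach is easy to picture: hand the raw notam text to the model and ask for a plain-language interpretation. The sketch below is a hypothetical illustration assuming OpenAI’s Python SDK; the sample notam and prompt wording are invented for demonstration and are not the group’s actual material.

```python
# Hypothetical sketch: ask an LLM to decode a notam into plain language.
# The sample notam and prompt are illustrative only and do not reproduce
# the Notam Alliance's actual exercise.
from openai import OpenAI

client = OpenAI()

raw_notam = "!JFK 05/123 JFK RWY 04L/22R CLSD 2305011200-2305011800"  # invented example

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system",
         "content": "You explain aviation notams in plain English for pilot briefings."},
        {"role": "user", "content": f"Explain this notam:\n{raw_notam}"},
    ],
)

print(response.choices[0].message.content)
```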

So can AI improve aviation safety? The short answer is yes, but along the way AI will find its appropriate applications, and it will continue to create “interest, controversy, fear, and excitement.” According to the ChatGPT response, “It’s important to note that aviation safety is an ongoing process that requires continuous improvement, vigilance, and collaboration among all stakeholders in the aviation industry.”

It’s also important to note that AI is not the end of humanity. For those humans with critical thinking skills and the ability to use prior experiences to perform complex tasks—pilots, safety professionals, and technical writers—the future is still bright.

The opinions expressed in this column are those of the author and are not necessarily endorsed by AIN Media Group.

Expert Opinion
True
Ads Enabled
True
Used in Print
False
Writer(s) - Credited
Publication Date (intermediate)
AIN Publication Date
----------------------------