Artificial intelligence is no longer just a futuristic buzzword in aviation—it's rapidly reshaping the skies above us. From streamlining aircraft maintenance to pushing the boundaries of flight automation, AI is becoming a cornerstone of modern aviation. But the question remains: how can the industry harness AI’s power while navigating challenges like safety, trust, and the sheer complexity of flight?
During a panel discussion on the eve of NBAA-BACE 2024, Honeywell Aerospace chief technology officer Todd Giles set the stage by highlighting AI’s expanding role within the aviation industry. “Artificial intelligence has transformed from the realm of science fiction to becoming an integral part of our daily lives and industries,” Giles said. AI is now embedded in aircraft design, manufacturing, and maintenance, driving advances such as predictive maintenance and improved operational efficiency.
From enhancing pilot decision-making to increasing productivity across business functions, AI’s influence is growing rapidly. However, Giles also cautioned that AI is often discussed out of context, creating confusion about its real potential and applications.
Pervinder Johar, CEO of Avathon, expanded on the various ways AI is shaping aviation: “Think of three different buckets: [...] managing aging infrastructure, improving safety and quality in manufacturing, and supporting future autonomy.” He pointed out the significant challenges posed by aging aviation infrastructure—assets that AI can help preserve through prescriptive maintenance.
AI technologies such as computer vision are also enhancing the ability to inspect aircraft components that are otherwise difficult for humans to assess, thus improving safety and efficiency.
Johar envisions AI playing a crucial role in managing the anticipated surge in urban air traffic from air taxis and drones. By 2030, he believes, AI will be essential for scaling operations to meet the demands of a growing aviation market.
AI’s role in assisting pilots during flight is also coming into focus. Trung Pham, chief scientist for AI and machine learning at the FAA, noted that while AI hasn't yet reached the point of fully operating aircraft, it is already making strides in augmented intelligence. “With an AI system, the system can see more than what we can focus on and inform us in a role of monitoring what's going on—and inform us of certain precursors that can lead to accidents or incidents,” he said.
Pham stressed that AI’s strength lies in its ability to process and analyze vast amounts of information, improving situational awareness during flights. He also highlighted AI’s post-flight capabilities, such as analyzing flight data to identify patterns that could enhance safety and efficiency in future operations. Pham underscored that human pilots remain essential for the foreseeable future, with AI playing a supportive role.
Matt George, CEO of Merlin, took a bolder stance on AI’s future in aviation, particularly regarding autonomy. He acknowledged that even the best pilots make mistakes and that automation could reduce human error. He asked pilots in the audience to raise their hands, and then said, “Put your hands down if you have not done something really stupid in an airplane that's almost killed you.” Noting how few hands had gone down, he said, “Even some of the best pilots in the world still have that same ratio.”
George traced the history of flight deck crew reductions—from five members after World War II to today’s two-person crews—and argued that recent technological advancements could further reduce this number.
Merlin, backed by Google, is developing a nonhuman pilot system to serve as an autonomous third "pilot" in the cockpit, with the goal of eventually enabling single-pilot operations. George emphasized the need for a gradual, responsible transition to autonomy, noting that Merlin is already working with the U.S. military to reduce crew size on aircraft like the C-130J and KC-135R.
For automation and AI alike, safety must remain the top priority. Pham reiterated that aviation's regulatory framework has been built on decades of safety improvements, and AI must meet these high standards. He pointed out that AI, unlike traditional systems, is trained rather than engineered, and this new development model requires additional testing to build public trust. Balancing innovation with safety is a major challenge, especially as public perception often demands perfection.
Pham cautioned against overestimating AI’s current capabilities, comparing the situation to NASA's early space program, in which primates were trained to perform basic tasks in space. “Would the monkey survive with something similar to the artificial intelligence that we have now, or does it require more than artificial intelligence to handle cases that we haven't seen before?” he asked.
Data is another critical challenge in AI’s aviation journey. Johar emphasized that high-quality data is essential for AI development, but accessing and sharing data across the industry is often difficult due to privacy and regulatory concerns.
AI systems, particularly those focused on perception, require vast amounts of data for training—data that isn’t always available in the real world. To fill that gap, companies are increasingly turning to synthetic data that simulates scenarios too rare or risky to capture in actual operations. Johar also highlighted the potential for AI to collect new types of data, such as sound and vibration, which could help identify issues before they become safety risks.
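To make the idea concrete, the following is a minimal, hypothetical Python sketch—not drawn from any panelist's system—of how synthetic vibration data might be generated where real fault recordings are scarce, then used to calibrate a simple anomaly check. The sample rate, frequencies, thresholds, and function names are all illustrative assumptions.

```python
# Illustrative sketch only: synthetic vibration data for a crude anomaly check.
# All signal parameters and names below are hypothetical, not from the panel.
import numpy as np

rng = np.random.default_rng(0)
FS = 1_000          # assumed sample rate, Hz
T = 2.0             # seconds of data per example

def synth_vibration(fault: bool) -> np.ndarray:
    """Return a synthetic accelerometer trace; inject a fault harmonic if asked."""
    t = np.arange(0, T, 1 / FS)
    signal = np.sin(2 * np.pi * 30 * t)              # nominal 30 Hz rotor tone
    signal += 0.1 * rng.standard_normal(t.size)      # broadband sensor noise
    if fault:
        signal += 0.4 * np.sin(2 * np.pi * 180 * t)  # hypothetical fault harmonic
    return signal

def band_energy(x: np.ndarray, lo: float, hi: float) -> float:
    """Energy in a frequency band, from a plain FFT power spectrum."""
    freqs = np.fft.rfftfreq(x.size, d=1 / FS)
    power = np.abs(np.fft.rfft(x)) ** 2
    return float(power[(freqs >= lo) & (freqs <= hi)].sum())

# Calibrate a simple threshold on synthetic "healthy" examples only.
healthy = [band_energy(synth_vibration(False), 150, 210) for _ in range(200)]
threshold = np.mean(healthy) + 5 * np.std(healthy)

# Flag a synthetic faulty example.
suspect = band_energy(synth_vibration(True), 150, 210)
print("anomaly" if suspect > threshold else "nominal")
```

In practice, a production system would rely on far richer simulation and learned models, but the sketch captures the workflow the panel described: manufacture the data you cannot easily collect, then use it to teach a system what "normal" looks like.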