The European Union Aviation Safety Agency (EASA) recently published Issue 2 of its Artificial Intelligence Concept Paper. Notably, it presents foundational concepts that “are crucial for the safe and trustworthy development and implementation of AI technologies in aviation,” according to EASA.
While the paper's concepts seek to advance artificial intelligence (AI) applications in aviation, EASA wants to ensure that humans remain involved in these developments. In the paper, EASA refines its guidance for Level 1 AI applications and introduces guidance for Level 2 AI-based systems.
Level 1 AI applications are “those enhancing human capabilities” and they deepen “the exploration of learning assurance, AI explainability, and ethics-based assessment,” EASA explained. “Level 2 AI introduces the groundbreaking concept of human-AI teaming (HAT), setting the stage for AI systems that automatically take decisions under human oversight.”
EASA released the concept paper to help those who are applying for certification of safety- or environment-related applications that will use AI or machine learning technologies in areas covered by the EASA Basic Regulation. EASA published its Artificial Intelligence Roadmap 2.0 last May; the roadmap is a living document that is updated regularly as AI development continues, “through discussions and exchanges of views, but also, practical work on AI development in which the Agency is already engaged.”
Some aviation use cases that EASA addresses in the concept paper include a visual landing guidance system, radio frequency suggestions, computer vision-based auto-taxi, and data acquisition.
Daedalean has been working on the visual landing system (VLS), a Level 1A AI-based system that consists of a high-resolution camera looking forward from the aircraft. “The VLS provides landing guidance for Part 91 aircraft on hard-surface runways in daytime visual meteorological conditions (VMC),” according to the concept paper.
Its system can recognize and track hard-surface runways, which can be selected by the pilot or via pre-configuration in a flight plan. Mimicking an instrument landing system (ILS), “the VLS provides the position of the aircraft in the runway coordinate frame as well as horizontal and vertical deviations from a configured glide slope.” It also provides uncertainty and validity flags.
A Level 1B application would use voice recognition to recognize an air traffic controller’s radio call and then suggest a frequency change to the flight crew. “The application is expected to reduce workload or help the pilot to confirm the correct understanding of a radio frequency in conditions of poor audio quality,” according to the report.
Developments in the Level 2 arena include computer vision-based auto-taxi systems that could detect and avoid ground obstacles. At the highest level covered, Level 2B, the paper discusses the pilot and AI teaming to solve problems.
An example of a Level 2B system is the Proxima virtual copilot, which could facilitate single-pilot operations. Proxima uses aircraft systems and displays to adjust how it supports the pilot, and it can also monitor the pilot’s mental and physical state, detect the pilot’s workload, and, by monitoring the pilot’s communications and data link messaging, monitor “aircraft position to ensure appropriate flight path management, and intervene where appropriate.”
It could also automatically configure the aircraft by extending the landing gear, for example; oversee navigation and communication; and identify and manage failure scenarios.
According to the concept paper, “Where practicable, the document identifies anticipated means of compliance and guidance material that could be used to comply with those objectives.”