The explosion of applications for artificial intelligence (AI) technology hasn’t escaped the notice of aviation researchers and developers, and an interesting use case around cameras and eye tracking has surfaced. The idea is that a system that can see a pilot’s eyes and analyze the captured data can do everything from detecting and predicting cognitive overload (sometimes referred to as tunnel vision) to fatigue, motion sickness, and even conditions such as Parkinson’s disease or traumatic brain injury. Two companies, HarmonEyes and Honeywell Aerospace, have recently unveiled research in this area, and the following summarizes how this technology may find its way into the cockpit of the future.
Honeywell Tackles Sleepiness in the Flight Deck
Pilot fatigue is a long-standing safety concern, prompting innovation into how flight deck workload can be managed and how technology could provide safeguards. A 2013 study by the British Airline Pilots’ Association found that up to 56% of pilots had fallen asleep while on duty, and 29% of respondents reported having woken up to discover that their co-pilot was also asleep.
Honeywell Aerospace is addressing this through its Pilot State Monitoring solution for commercial airliners. The work forms part of the EU-backed SESAR-3 project Digital Assistants for Reducing Workload and Increasing Collaboration (DARWIN).
The system combines real-time camera feeds with AI software that detects and processes pilot facial cues and potential abnormalities. While the sleep and drowsiness element has already reached technology readiness level (TRL) 6, the ability to detect pilot incapacitation is expected to achieve the same maturity next year, according to Honeywell.
Speaking during a recent media briefing at the group’s research and development hub at Brno in the Czech Republic, Bohdan Blaha, senior software engineering supervisor of the DARWIN project, explained that detecting drowsiness or incapacitation in the flight deck is significantly more difficult than with car drivers. This, he explained, is because pilots typically shift their focus from instrument panels to other tasks and have greater freedom to move around or even leave the aircraft’s controls.
Honeywell’s technology uses a monochrome camera to track real-time facial features such as eye positioning, with parameters such as blinking, duration of eye closure, yawning, and overall head posture processed every 30 seconds. An AI algorithm can then determine whether the pilot is drowsy, fully asleep, or otherwise incapacitated.
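To make the reporting concrete, here is a minimal Python sketch of how a 30-second window of eye-closure and posture features could be mapped to the three states the article mentions. The feature names and thresholds are illustrative assumptions for this sketch, not Honeywell's actual algorithm, which relies on a trained AI model rather than fixed rules:

```python
from dataclasses import dataclass

@dataclass
class WindowFeatures:
    """Illustrative features aggregated over one 30-second window."""
    eye_closed_fraction: float   # fraction of frames with eyes closed (PERCLOS-like)
    longest_closure_s: float     # longest continuous eye closure, in seconds
    yawns: int                   # yawns detected in the window
    head_drop: bool              # sustained downward head posture detected

def classify_pilot_state(f: WindowFeatures) -> str:
    """Map one feature window to a coarse state label.

    Thresholds here are invented for illustration only.
    """
    if f.longest_closure_s >= 10.0 or (f.head_drop and f.eye_closed_fraction > 0.8):
        return "incapacitated_or_asleep"
    if f.eye_closed_fraction > 0.3 or f.yawns >= 3:
        return "drowsy"
    return "alert"
```

A production system would replace the hand-set thresholds with a model trained on labeled footage, but the input features (blink rate, closure duration, yawning, head posture) match those the article describes.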
Alerts Wake Up Pilots, Confidentially
Pilots can then be prompted to be more alert with aural warnings. In the interest of confidentiality, Honeywell’s real-time system does not share or record data from incidents. Although Honeywell has experimented with smartwatches and other so-called “wearable” technology, Blaha explained that sharing data from these devices can pose privacy concerns. That approach can also be undermined if pilots forget to wear the device or if its batteries are flat.
As part of ongoing evaluations, Honeywell has combined real-world data from its test aircraft with simulator-based trials. This included inviting a multitude of tired Brno employees to validate the system’s alert functionality during various phases of drowsiness.
Accessories such as sunglasses and hats were also found to have no detrimental effect on the system’s functionality. Although Blaha acknowledged that a pilot who is incapacitated by something like a medical emergency is harder to validate through simulation, the system will nevertheless be of benefit. For instance, it could be deployed in future single-pilot cockpits as an extra layer of safety protection.
Following successful testing on Honeywell’s Beech Bonanza, Dassault Falcon 900, and Boeing 757 test aircraft, the project’s scope was expanded in 2025 to include an Embraer 170 airliner. An unidentified airline has also been testing the pilot monitoring system for 18 months onboard its Airbus A321, with the potential for it to enter service after DARWIN is completed in 2026.
HarmonEyes Predicts Cognitive Overload with AI Eye Tracking
Twelve years ago, co-founders Adam Gross and Melissa Hunfalvay started a company called RightEye to take eye-tracking technology out of the laboratory and into practical applications; they have since launched a new division called HarmonEyes. RightEye had sold thousands of eye-tracking devices to military, government, professional sports, hospital, and medical users, but technology was converging to make it possible to design new eye-tracking solutions marrying ubiquitous cameras with AI tools. The RightEye hardware is too complicated to use in a closed environment like a cockpit, for example, but that limitation does not apply to smartphone cameras or even standalone cameras.
Gross explained that the HarmonEyes software development kit (SDK) is designed not to store any eye-tracking data. As the system delivers outputs every second, it destroys the prior second’s data. “We’re not storing, collecting, or recording any of the eye-tracking data,” he said. “We’re delivering the output to the customer, and it’s up to the customer to secure consent from their pilots and their users, and they can use it the way that they want.”
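The privacy design Gross describes (emit an output each second, then destroy the prior second's raw data) can be sketched as a simple non-retentive buffer. The class and field names below are illustrative assumptions, not the actual HarmonEyes SDK API:

```python
class NonRetentiveTracker:
    """Accumulate one second of gaze samples, emit a summary, then discard.

    Mirrors the privacy design described in the article: only derived
    outputs leave the tracker, and raw samples are never retained
    beyond the second in which they were captured.
    """

    def __init__(self):
        self._samples = []

    def add_sample(self, gaze_x: float, gaze_y: float) -> None:
        """Record one raw gaze sample for the current second."""
        self._samples.append((gaze_x, gaze_y))

    def emit(self) -> dict:
        """Summarize the current second's samples and clear the raw data."""
        n = len(self._samples)
        out = {
            "n_samples": n,
            "mean_x": sum(x for x, _ in self._samples) / n if n else 0.0,
            "mean_y": sum(y for _, y in self._samples) / n if n else 0.0,
        }
        self._samples.clear()   # destroy the prior second's raw data
        return out
```

The key design point is that `emit` is destructive: once the per-second summary is handed to the customer, the raw eye-tracking samples no longer exist anywhere in the system.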
From its RightEye device business, HarmonEyes had access to nearly 15 million unique records of eye movements that signal “performance-based metrics like cognitive load…and fatigue and brain health signatures like Parkinson’s disease and traumatic brain injury,” Gross explained. “We’ve got this massive data set that we can train AI models on. But the real turning point was that the underlying technology that allowed you to extract an eye-tracking signal is mostly camera-based. Ten years ago, we needed kind of bespoke hardware for that. Now, whether it be a webcam on your PC, a phone, a tablet, a mixed-reality headset in the cockpit, those cameras are better, faster, cheaper, and we can extract the gaze vectors or the eye-tracking signal from it.”
This resulted in the development of three performance-related tools using AI, addressing cognitive load, fatigue, and motion sickness. Gross explained cognitive load as “the level of mental effort when someone’s performing a task and then measuring the reserve that someone’s got in capacity to deal with surprises or new things coming up.”
Where AI helps is not just in analyzing the eye-tracking data but in creating predictive models. The goal wasn’t to help customers create a reactive solution if, say, a pilot is overloaded and fixates on the wrong task at the wrong time. “If I just tell you when you are in cognitive overload, which is a risky scenario to be in if you’re flying a plane, then it’s too late. You’re already overloaded,” he said.
“What we do is leverage AI and time-series data, and we create predictive models so we deliver a time predictor of when you’re going to reach an overload state, a high level of fatigue, and even motion sickness. That’s our differentiator.”
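As a toy illustration of the "time predictor" idea, the Python function below fits a linear trend to recent cognitive-load estimates and extrapolates when the load will cross an overload threshold. It is a deliberately simple stand-in for the time-series AI models HarmonEyes describes; the threshold value and data format are assumptions:

```python
def predict_time_to_overload(load_history, threshold=0.8):
    """Estimate seconds until cognitive load crosses `threshold`.

    load_history: list of (t_seconds, load) pairs, load in [0, 1].
    Returns the estimated time remaining, 0.0 if already over the
    threshold, or None if the trend is flat/falling or data is sparse.
    """
    n = len(load_history)
    if n < 2:
        return None
    # Ordinary least-squares slope of load over time.
    t_mean = sum(t for t, _ in load_history) / n
    y_mean = sum(y for _, y in load_history) / n
    cov = sum((t - t_mean) * (y - y_mean) for t, y in load_history)
    var = sum((t - t_mean) ** 2 for t, _ in load_history)
    slope = cov / var
    if slope <= 0:
        return None  # load is not rising; no overload predicted
    t_last, y_last = load_history[-1]
    if y_last >= threshold:
        return 0.0
    return (threshold - y_last) / slope
```

A real predictor would use richer time-series models and many eye-movement features at once, but the contract is the same: instead of flagging overload after it happens, the output is an advance warning of when it is likely to arrive.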
Knowing that a person is heading toward what Gross called “a problematic state” allows for possible interventions, preferably before cognitive overload sets in. “If someone’s overloaded, you could deflect a decision to the copilot,” he said. “You can have a system intervention [such as haptic feedback, aural warnings, or other technology led by the AI agent]. It depends on the context of what’s happening. But having that insight, knowing that someone’s going to reach a problematic state, the stakeholder can deliver that intervention, and there could be several different interventions. If you can eliminate the dual- or triple-tasking elements of what’s happening at the moment in time, right then, you can increase the capacity and the reserve that a pilot has mentally.”
In a training environment, an instructor could choose to let a high-cognitive-load situation play out, then address the problem in the debriefing and retrain to prevent it.
Eventually, HarmonEyes plans to offer AI agents that will be integrated into the detection system and deliver autonomous interventions. Another use case for this technology would be comparing expert and novice pilot profiles, which could help guide training and recruitment programs.
Extending this idea further, HarmonEyes could evaluate exactly where in the cockpit an experienced, unfatigued pilot looks while in a moderately cognitively loaded state, to verify they are viewing the right instruments or displays at the right time.
HarmonEyes is already working with NASA, the military, and airlines, as well as Formula 1 driver training simulation programs.
“Where we really add value is having the onboard system understand the context of the pilot in their situation,” Gross said. “It’s like contextual AI. It’s understanding the environment, the task, the situation, and then the pilot, and when you put those three things together, you get true contextual AI, and you can start to imagine the value in terms of not only being able to deliver elite performance but also avoiding catastrophic events.”