Gaze tracking: military instructors take a fresh look at pilot training

[Image: an instructor using Seeing Machines technology to monitor a pilot’s gaze. Credit: Seeing Machines]

During a flight, pilots must process an enormous amount of information and act accordingly. They have to be able to focus their attention on the right cockpit instrument at the right time, while balancing an internal and external scan. 

The required level of situational awareness is even more critical for military flight crews. This includes fighter pilots, who face dangerous and potentially hostile environments, a more complex and somewhat different scenario than that of their commercial counterparts.

Pilots, both military and commercial, gather up to 90% of the information needed for their task and performance visually. Gaze reflexes are acquired from day one of training, and a trainee’s observation pattern becomes an important factor for an instructor to monitor.

Traditionally, instructors have found it difficult to assess a pilot’s scan while sitting at the back of the simulator. The increased use of Head-Up Displays (HUDs), particularly in military, single-seat fighter aircraft and their representative simulators, amplifies this challenge.

If instructors can better understand pilots’ gaze and scan behavior, they are better able to interpret trainee performance and provide guidance, improving overall training efficacy. Military flight training introduces elements absent from civilian training, such as tactical flying and an accelerated syllabus, typified by a steep learning curve and a historically high attrition rate. A contributing factor to this failure rate is suboptimal monitoring of instruments and the environment by trainee pilots.

Through its Jericho ‘fifth-generation’ innovation program, the Royal Australian Air Force sought to scientifically measure the effectiveness of eye-tracking-based training in a military operational flight training environment. Australian company Seeing Machines, an industry leader in computer vision and operator monitoring technology, was selected as the solution provider for the Jericho initiative.  

The training environment selected for the evaluation was the RAAF’s advanced Lead-In Fighter Training, conducted on the Hawk 127 jet. Complementing live flying, a significant component of the training is conducted in the Air Force’s three state-of-the-art, CAE-manufactured Flight Training Devices (FTDs), also called Full Mission Simulators.

Seeing Machines installed its Crew Training System (CTS) in two FTDs and, over the course of 18 months and two Introductory Fighter Courses, objectively and subjectively measured the utility of eye-tracking-based training aids. An important requirement for training fidelity and certification was that the CTS be non-intrusive and non-wearable (totally off-body) and able to accurately and reliably track pilots’ gaze in real time, displaying it on the Instructor Operator Station and recording it as a video file for later debrief.


Results have shown that the CTS helped instructors move from anecdotal to factual evidence when evaluating where trainees were looking, and also helped them understand trainees’ decision-making, as Seeing Machines Aviation General Manager Patrick Nolan explained to AeroTime.

“One Qualified Flying Instructor (QFI) watched a scenario playback without eye-tracking and remarked that they may have failed that trainee based on their observed performance. The accuracy was not at the required standard in terms of speed, height, and heading maintenance – all signs of cognitive overload,” Nolan said. “We then showed the instructor the same scenario playback but with the trainee’s gaze tracking overlaid on the instruments.”

Nolan continued: “In this specific scenario the QFI was open to an alternative decision, because what he could see with this additional information was that the trainee’s performance issues were not related to cognitive load. The student pilot actually had a very good, structured, and disciplined scan pattern, and was not cognitively overloaded, but simply required some more time in the Hawk simulator getting comfortable from a ‘hands and feet’ coordination perspective.”

He added: “So, instead of failing the trainee, adding a remedial training package, and adding significant cost, the instructor determined that he would give this trainee an extra 10-15 minutes of free flight on his own to get more comfortable flying the aircraft.”

While Jericho’s research scope was gaze tracking, Seeing Machines continues to develop features using signals that the CTS collects, such as pupil dilation, fixation, and dwell times. Through the company’s Human Factors and Advanced Engineering capability, these signals can be developed into features useful to all aspects of the training process, including measurements of stress and fatigue, situational awareness, and cognitive workload.
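As a rough illustration of how such signals are derived (this is a hypothetical sketch, not Seeing Machines’ implementation), a dwell-time metric can be computed from a stream of timestamped gaze points by accumulating the time gaze spends inside each instrument’s area of interest. The region names, sample format, and panel layout below are all illustrative assumptions:

```python
# Hypothetical sketch: deriving per-instrument dwell time from gaze samples.
# AOI names, coordinates, and the sample format are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float   # timestamp in seconds
    x: float   # normalized gaze x position (0..1 across the panel)
    y: float   # normalized gaze y position (0..1 down the panel)

# Areas of interest: name -> (x_min, y_min, x_max, y_max), invented layout
AOIS = {
    "airspeed":  (0.00, 0.0, 0.25, 0.5),
    "attitude":  (0.25, 0.0, 0.50, 0.5),
    "altimeter": (0.50, 0.0, 0.75, 0.5),
}

def dwell_times(samples):
    """Accumulate the time each sample interval spends inside an AOI."""
    totals = {name: 0.0 for name in AOIS}
    for prev, cur in zip(samples, samples[1:]):
        dt = cur.t - prev.t
        for name, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= prev.x < x1 and y0 <= prev.y < y1:
                totals[name] += dt
                break
    return totals
```

An instructor-facing tool could aggregate these totals per maneuver to show, for example, whether a trainee neglected the airspeed indicator during a turn.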

A limitation identified during the Jericho engagement was the capability’s central, fixed mounting position. While the camera’s field of view is wide and suitable for all aspects of visual and instrument flying, the system loses gaze tracking during dynamic air combat maneuvers whenever it can no longer see the pilot’s eyes. This is a physical limitation of a single sensor, and it was agreed that the requirement warranted further investigation.

“If they’re flying tactically and, particularly, basic fighter maneuvering, they’re spending a lot of time looking up and looking back, but the sensor is sitting right in front of them,” Nolan said. “So, all the system is really seeing then is the back or the side of their head. This is an opportunity to leverage our multi-camera solutions, but only if there is a clear value proposition to support a more complex capability. Additionally, another strong use case is Crew Resource Management in a multi-crew side-by-side cockpit environment, where a wide field-of-view camera may be more suitable than two single-camera sensors. This solution is relevant to cockpit environments outside of single-seat fighter jets.”

Moving forward, Seeing Machines does not rule out extending its technology to other roles in military aviation. With tests currently being conducted on air traffic controllers, insights gained in a console environment could eventually apply to other crewed positions, such as those aboard maritime patrol, airborne early warning and control, and refueling aircraft, or even to uncrewed aerial systems (UAS).

“We haven’t done a huge amount of work in the UAS space, but there’s no doubt there’s genuine interest in understanding that, given the workload and the stress associated with drone flying and the complexity of multiple crew in this environment,” Nolan said.