Assessment of UAV operator workload in a reconfigurable multi-touch ground control station environment
Multi-touch inputs allow users to interact with a virtual environment through gesture commands on a monitor rather than a mouse and keyboard. This style of input is intuitive to adopt because gestures directly mirror how one interacts with the physical environment. This paper presents and assesses a personal-computer-based unmanned aerial vehicle (UAV) ground control station that utilizes multi-touch gesture inputs and system reconfigurability to enhance operator performance. The system was developed at Ryerson University’s Mixed-Reality Immersive Motion Simulation Laboratory using commercial off-the-shelf Presagis software. The ground control station was then evaluated using the NASA Task Load Index (TLX) to determine whether the inclusion of multi-touch gestures and reconfigurability reduced operator workload relative to traditional mouse-and-keyboard inputs. In this assessment, participants flew a simulated aircraft through a specified set of waypoints and operated a payload controller within a predetermined area. Initial TLX results from these flight tests show that the developed touch-capable ground control station reduced overall operator workload, lowering scores on all six TLX workload dimensions.