Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation
Abstract
Teleoperation is widely used to perform tasks in dangerous or unreachable environments by replacing humans with remotely controlled agents, and human-robot interaction (HRI) is central to it. Conventional HRI input devices include the keyboard, mouse, and joystick. However, these devices are unsuitable for users with physical disabilities, and operating multiple hand-held input devices simultaneously increases the mental workload even of able-bodied users. Hence, this study presents HRI based on gaze tracking with an eye tracker. Object selection is a critical operation that occurs at high frequency during HRI control, and this paper introduces gaze gestures as an object selection strategy in HRI for drone teleoperation. To test and validate the performance of the gaze gesture selection strategy, we evaluated both objective and subjective measures. The objective measures were drone control performance, namely mean task completion time and mean error rate; the subjective measure was an analysis of participant perception. The results show that the gaze gesture selection strategy has great potential as an additional HRI modality for agent teleoperation.
Published
2014-09-29
How to Cite
Yu, M., Lin, Y., Schmidt, D., Wang, X., & Wang, Y. (2014). Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation. Journal of Eye Movement Research, 7(4). https://doi.org/10.16910/jemr.7.4.4
Section
Articles
License
Copyright (c) 2014 Mingxin Yu, Yingzi Lin, David Schmidt, Xiangzhou Wang, Yu Wang
This work is licensed under a Creative Commons Attribution 4.0 International License.