Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation

  • Mingxin Yu Beijing Institute of Technology, China, and Northeastern University, USA
  • Yingzi Lin Northeastern University, USA
  • David Schmidt Northeastern University, USA
  • Xiangzhou Wang Beijing Institute of Technology, China
  • Yu Wang Beijing Institute of Technology, China
Keywords: human-robot interaction, teleoperation, gaze gestures, object selection, gaze-controlled interface


Teleoperation has been widely used to perform tasks in dangerous and unreachable environments by replacing humans with controlled agents. Human-robot interaction (HRI) is central to teleoperation. Conventional HRI input devices include the keyboard, mouse, and joystick. However, these are not suitable for users with physical disabilities, and they increase the mental workload of able-bodied users, who must operate multiple hand-held input devices simultaneously. Hence, this study presents an HRI based on gaze tracking with an eye tracker. Object selection is of great importance and occurs at a high frequency during HRI control. This paper introduces gaze gestures as an object selection strategy in HRI for drone teleoperation. To test and validate the performance of the gaze-gesture selection strategy, we evaluate both objective and subjective measurements. The objective measurements are drone control performance, namely mean task completion time and mean error rate; the subjective measurement is an analysis of participant perception. The results show that the gaze-gesture selection strategy has great potential as an additional HRI modality for agent teleoperation.
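To illustrate the general idea of gaze-gesture recognition described above, the following is a minimal sketch, not the authors' implementation. It assumes gaze gestures are encoded as sequences of directional strokes (up/down/left/right) extracted from the turning points of the gaze path; the function names, the stroke-length threshold, and the example command mapping are all hypothetical.

```python
import math

# Hypothetical threshold: a stroke shorter than this (in pixels)
# is treated as fixation jitter rather than a deliberate gesture stroke.
MIN_STROKE_LEN = 50.0

def stroke_direction(p0, p1):
    """Classify the dominant direction of one gaze stroke as U/D/L/R."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    if math.hypot(dx, dy) < MIN_STROKE_LEN:
        return None  # too short to count as a stroke
    if abs(dx) >= abs(dy):
        return "R" if dx > 0 else "L"
    return "D" if dy > 0 else "U"  # screen y grows downward

def recognize_gesture(gaze_points, gestures):
    """Map an ordered list of (x, y) gaze turning points to a command.

    gestures: dict from direction strings (e.g. "UD") to command names.
    Returns None if the stroke sequence matches no known gesture.
    """
    dirs = []
    for p0, p1 in zip(gaze_points, gaze_points[1:]):
        d = stroke_direction(p0, p1)
        if d and (not dirs or dirs[-1] != d):  # collapse repeats
            dirs.append(d)
    return gestures.get("".join(dirs))

# Hypothetical gesture-to-command mapping for drone control.
commands = {"R": "yaw_right", "L": "yaw_left", "UD": "select"}
path = [(100, 300), (105, 100), (102, 310)]  # sharp up stroke, then down
print(recognize_gesture(path, commands))  # → select
```

In a real system the turning points would come from an eye tracker's fixation/saccade stream, and the gesture vocabulary would be chosen to avoid collisions with natural viewing behavior.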
How to Cite
Yu, M., Lin, Y., Schmidt, D., Wang, X., & Wang, Y. (2014). Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation. Journal of Eye Movement Research, 7(4).
