Stereoscopic First Person View System for Drone Navigation

  • Nikolai Smolyanskyi,
  • Mar Gonzalez Franco

Frontiers in Robotics and AI, Vol. 4(11)


Ground control of unmanned aerial vehicles (UAVs) is key to the advancement of this technology for commercial purposes. The need for reliable ground control arises in scenarios where human intervention is necessary, e.g., handover situations when autonomous systems fail. Manual flights are also needed for collecting diverse datasets to train deep neural network-based control systems. This is even more prominent for unmanned flying robots, where there is no simple way to capture optimal navigation footage. In such scenarios, improving ground control and developing better autonomous systems are two sides of the same coin. To improve the ground control experience, and thus the quality of the footage, we propose to upgrade onboard teleoperation systems to a fully immersive setup that provides operators with a stereoscopic first person view (FPV) through a virtual reality (VR) head-mounted display. We tested users (n = 7) by asking them to fly our drone in the field. Test flights showed that operators using our system can take off, fly, and land successfully while wearing VR headsets. In addition, we ran two experiments with a wider set of participants (n = 69 and n = 20), using prerecorded videos of flights and walks, to compare the proposed technology to the experience provided by current drone FPV solutions, which only include monoscopic vision. Our immersive stereoscopic setup enables more accurate depth perception, which has clear implications for better teleoperation and unmanned navigation. Our studies also provide comprehensive data on the motion and simulator sickness induced by a stereoscopic setup. We present the device specifications as well as the measures that improve the teleoperation experience and reduce induced simulator sickness. Our approach provides higher perceptual fidelity during flights, which leads to more precise teleoperation and ultimately translates into better flight data for training deep UAV control policies.