Human Detection and Gesture Recognition for the Navigation of Unmanned Aircraft

Markus Lieret, Maximilian Hübner, Christian Hofmann, Jörg Franke

Research output: Contribution to journal › Conference article › Peer-reviewed

Abstract

Unmanned aircraft (UA) have become increasingly popular for various industrial indoor applications in recent years. Typical applications include automated stocktaking in high-bay warehouses, the automated transport of materials, and inspection tasks. Due to the limited space in indoor environments and ongoing production, UA often need to operate at closer distances to humans than in outdoor applications. To reduce the risk to persons present in the working area of the UA, the UA must be able to perceive and locate persons and to react appropriately to their behaviour. In this paper, we present an approach to influence the flight mission of autonomous UA using different gestures. The UA detects persons within its flight path using an on-board camera and pauses its current flight mission. Subsequently, the body posture of the detected persons is determined, so that the persons can provide further flight instructions to the UA via defined gestures. The proposed approach is evaluated in simulation and in real-world flight tests and achieves a gesture-recognition accuracy between 82 and 100 percent, depending on the distance between the persons and the UA.
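The final step of the pipeline described above, mapping a detected body posture to a flight instruction, can be sketched as follows. This is a minimal illustration, not the paper's method: the gesture set, the instruction names, and the raised-arm heuristic are all assumptions chosen for the example; the paper does not specify which gestures or keypoints it uses.

```python
# Hypothetical sketch: classify a simple arm gesture from 2D pose keypoints
# and map it to a flight instruction for a paused UA mission.
# Gesture names, instructions, and the raised-arm rule are illustrative
# assumptions, not taken from the paper.

from dataclasses import dataclass
from typing import Tuple


@dataclass
class Keypoints2D:
    """2D image coordinates (x, y) with y growing downward, as is usual
    for camera images; only the four joints needed here are included."""
    left_shoulder: Tuple[float, float]
    right_shoulder: Tuple[float, float]
    left_wrist: Tuple[float, float]
    right_wrist: Tuple[float, float]


def classify_gesture(kp: Keypoints2D) -> str:
    """Map a body posture to a flight instruction.

    A wrist above its shoulder (smaller image y) counts as a raised arm.
    """
    left_raised = kp.left_wrist[1] < kp.left_shoulder[1]
    right_raised = kp.right_wrist[1] < kp.right_shoulder[1]
    if left_raised and right_raised:
        return "LAND"        # both arms raised: abort mission and land
    if right_raised:
        return "CONTINUE"    # right arm raised: resume the paused mission
    if left_raised:
        return "HOLD"        # left arm raised: keep hovering in place
    return "NONE"            # no gesture detected: stay paused


# Example: a person raising both arms (both wrists above the shoulders)
kp = Keypoints2D(left_shoulder=(120, 200), right_shoulder=(180, 200),
                 left_wrist=(110, 120), right_wrist=(190, 130))
print(classify_gesture(kp))  # -> LAND
```

In a full system, the keypoints would come from an on-board person detector and pose estimator running on the camera stream, and the returned instruction would be forwarded to the flight controller only after the gesture has been held stable for several consecutive frames.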

Keywords

  • Computer Vision
  • Gesture Recognition
  • Indoor Navigation
  • Machine Learning
  • Unmanned Aircraft

