A1 Refereed original research article in a scientific journal
Vision-based safe autonomous UAV docking with panoramic sensors
Authors: Nguyen Phuoc Thuan, Westerlund Tomi, Peña Queralta Jorge
Publisher: Frontiers Media
Publishing place: Lausanne
Publication year: 2023
Journal: Frontiers in Robotics and AI
Journal name in source: FRONTIERS IN ROBOTICS AND AI
Article number: 1223157
Volume: 10
ISSN: 2296-9144
DOI: https://doi.org/10.3389/frobt.2023.1223157
Web address: https://www.frontiersin.org/articles/10.3389/frobt.2023.1223157/full
Self-archived copy’s web address: https://research.utu.fi/converis/portal/detail/Publication/380536271
The remarkable growth of unmanned aerial vehicles (UAVs) has also sparked concerns about safety measures during their missions. To advance towards safer autonomous aerial robots, this work presents a vision-based solution for ensuring safe autonomous UAV landings with minimal infrastructure. During docking maneuvers, UAVs pose a hazard to people in the vicinity. In this paper, we propose the use of a single omnidirectional panoramic camera pointing upwards from a landing pad to detect and estimate the position of people around the landing area. The images are processed in real time on an embedded computer, which communicates with the onboard computer of approaching UAVs to transition between landing, hovering, or emergency landing states. While landing, the ground camera also aids in finding an optimal landing position, which can be required in case of low battery or when hovering is no longer possible. We use a YOLOv7-based object detection model and an XGBoost model for localizing nearby people, and the open-source ROS and PX4 frameworks for communication, interfacing, and control of the UAV. We present both simulation and real-world indoor experimental results to show the effectiveness of our methods.
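The abstract describes a ground-side decision loop: people detected by the panoramic camera are localized, and the commanded UAV state switches between landing, hovering, and emergency landing. The sketch below is an illustrative reconstruction of that logic, not code from the paper; the function names, the safety radius, and the battery threshold are assumptions made for the example.

```python
# Illustrative sketch (not from the paper) of a per-frame state decision based on
# detected people around the landing pad. Thresholds and names are assumptions.
from dataclasses import dataclass
from enum import Enum, auto
from typing import List


class UavState(Enum):
    LANDING = auto()
    HOVERING = auto()
    EMERGENCY_LANDING = auto()


@dataclass
class PersonEstimate:
    # Estimated distance from the landing pad, assumed to come from the
    # detection + localization stage (e.g., YOLOv7 detections fed to a regressor).
    distance_m: float


def decide_state(people: List[PersonEstimate],
                 battery_fraction: float,
                 safe_radius_m: float = 3.0,        # assumed safety radius
                 critical_battery: float = 0.15     # assumed low-battery threshold
                 ) -> UavState:
    """Return the state to command to the approaching UAV for the current frame."""
    area_clear = all(p.distance_m > safe_radius_m for p in people)
    if area_clear:
        return UavState.LANDING
    # Someone is inside the safety radius: hover if the battery allows it,
    # otherwise fall back to an emergency landing.
    if battery_fraction > critical_battery:
        return UavState.HOVERING
    return UavState.EMERGENCY_LANDING


if __name__ == "__main__":
    detections = [PersonEstimate(distance_m=1.8), PersonEstimate(distance_m=6.2)]
    print(decide_state(detections, battery_fraction=0.40))  # UavState.HOVERING
    print(decide_state(detections, battery_fraction=0.10))  # UavState.EMERGENCY_LANDING
    print(decide_state([], battery_fraction=0.40))          # UavState.LANDING
```

In the system described by the abstract, a decision of this kind would be published over ROS to the UAV's onboard computer, which carries out the corresponding PX4 flight mode change.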
Downloadable publication: This is an electronic reprint of the original article.