A4 Refereed article in a conference publication
Vision-Based GNSS-Free Localization for UAVs in the Wild
Authors: Gurgu Marius-Mihail, Peña Queralta Jorge, Westerlund Tomi
Editors: N/A
Conference name: International Conference on Mechanical Engineering and Robotics Research
Publication year: 2023
Book title: International Conference on Mechanical Engineering and Robotics Research
Journal name in source: 2022 7th International Conference on Mechanical Engineering and Robotics Research (ICMERR)
First page: 7
Last page: 12
Number of pages: 6
ISBN: 978-1-6654-9052-8
eISBN: 978-1-6654-9051-1
DOI: https://doi.org/10.1109/ICMERR56497.2022.10097798
Web address: https://ieeexplore.ieee.org/document/10097798
Preprint address: https://arxiv.org/abs/2210.09727
Abstract
Considering the accelerated development of Unmanned Aerial Vehicle (UAV) applications in both industrial and research scenarios, there is an increasing need to localize these aerial systems in non-urban environments using GNSS-free, vision-based methods. Our paper proposes a vision-based localization algorithm that uses deep features to compute the geographical coordinates of a UAV flying in the wild. The method matches salient features between RGB photographs captured by the drone camera and sections of a pre-built map composed of georeferenced open-source satellite images. Experimental results show that vision-based localization achieves accuracy comparable to that of traditional GNSS-based methods, which serve as ground truth. Compared to state-of-the-art Visual Odometry (VO) approaches, our solution is designed for long-distance, high-altitude UAV flights. Code and datasets are available at https://github.com/TIERS/wildnav.
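The abstract outlines the core pipeline: match salient features between a drone photograph and a georeferenced satellite tile, then convert the matched image location into geographic coordinates. The sketch below illustrates that idea only; it is not the authors' implementation. It substitutes classical ORB features from OpenCV for the paper's learned deep features, and the helper names (pixel_to_latlon, localize) and the north-up tile with known corner coordinates are assumptions introduced for this example.

import cv2
import numpy as np

def pixel_to_latlon(px, py, tile_bounds, tile_shape):
    # Linearly interpolate a pixel coordinate to lat/lon using the
    # georeferenced corner bounds (lat_top, lon_left, lat_bottom, lon_right)
    # of a satellite tile. Assumes a small, north-up, roughly planar tile.
    lat_top, lon_left, lat_bottom, lon_right = tile_bounds
    h, w = tile_shape[:2]
    lat = lat_top + (py / h) * (lat_bottom - lat_top)
    lon = lon_left + (px / w) * (lon_right - lon_left)
    return lat, lon

def localize(drone_img, tile_img, tile_bounds, min_matches=15):
    # Match features between a drone photo and one map tile, estimate a
    # homography with RANSAC, and project the drone image center into the
    # tile to obtain geographic coordinates. Returns None on weak matches.
    if drone_img.ndim == 3:
        drone_img = cv2.cvtColor(drone_img, cv2.COLOR_BGR2GRAY)
    if tile_img.ndim == 3:
        tile_gray = cv2.cvtColor(tile_img, cv2.COLOR_BGR2GRAY)
    else:
        tile_gray = tile_img
    orb = cv2.ORB_create(4000)
    kp1, des1 = orb.detectAndCompute(drone_img, None)
    kp2, des2 = orb.detectAndCompute(tile_gray, None)
    if des1 is None or des2 is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None
    h, w = drone_img.shape[:2]
    center = np.float32([[[w / 2.0, h / 2.0]]])
    cx, cy = cv2.perspectiveTransform(center, H)[0, 0]
    return pixel_to_latlon(cx, cy, tile_bounds, tile_img.shape)

In a full system of the kind the abstract describes, one would run such a matcher against every candidate tile of the pre-built georeferenced map and keep the highest-confidence result as the position estimate.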