Continual Learning for Robust Gate Detection under Dynamic Lighting in Autonomous Drone Racing

Zhongzheng Qiao, Xuan Huy Pham, Savitha Ramasamy, Xudong Jiang, Erdal Kayacan, Andriy Sarabakha

Publication: Contribution to book/anthology/report/proceedings · Conference contribution in proceedings · Research · Peer-reviewed

Abstract

In autonomous and mobile robotics, a principal challenge is resilient real-time environmental perception, particularly in situations with unknown and dynamic elements, as exemplified by autonomous drone racing. This study introduces a perception technique for detecting drone racing gates under illumination variations, which are common during high-speed drone flights. The proposed technique relies on a lightweight neural network backbone augmented with continual learning capabilities. The approach combines predictions of the gates' positional coordinates, distance, and orientation into a single pose tuple. A comprehensive set of tests demonstrates the efficacy of this approach in diverse and challenging scenarios, specifically those involving variable lighting conditions, and confirms its notable robustness to illumination variations.
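To illustrate the kind of architecture the abstract describes, the sketch below shows a lightweight convolutional backbone with separate regression heads for the gate's centre coordinates, distance, and orientation, whose outputs are fused into one pose tuple. This is a minimal, hypothetical PyTorch example, not the authors' implementation: the GatePoseNet name, layer sizes, and input resolution are assumptions, and the continual-learning mechanism itself is not shown.

```python
import torch
import torch.nn as nn

class GatePoseNet(nn.Module):
    """Illustrative lightweight gate-detection network (hypothetical, not the paper's model).

    A small convolutional backbone feeds three regression heads whose outputs are
    concatenated into a single pose tuple: gate centre (x, y), distance d, and
    orientation theta.
    """

    def __init__(self):
        super().__init__()
        # Hypothetical lightweight backbone: a few strided conv blocks + global pooling.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        # Separate heads for the components of the pose tuple.
        self.center_head = nn.Linear(64, 2)       # gate centre (x, y) in the image
        self.distance_head = nn.Linear(64, 1)     # distance to the gate
        self.orientation_head = nn.Linear(64, 1)  # relative gate orientation

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        features = self.backbone(image)
        # Fuse the per-head predictions into one pose tuple (x, y, d, theta).
        return torch.cat(
            [self.center_head(features),
             self.distance_head(features),
             self.orientation_head(features)],
            dim=1,
        )

if __name__ == "__main__":
    model = GatePoseNet()
    dummy_frame = torch.randn(1, 3, 120, 160)  # assumed low-resolution camera frame
    print(model(dummy_frame).shape)  # torch.Size([1, 4]): one pose tuple per image
```

In such a design, keeping the backbone small is what makes on-board, real-time inference plausible, while a continual-learning strategy (e.g., rehearsal on stored frames from earlier lighting conditions) would update the network as illumination changes without retraining from scratch.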
Original language: English
Title: 2024 International Joint Conference on Neural Networks, IJCNN 2024 - Proceedings
Number of pages: 8
Publisher: IEEE
Publication date: 2 May 2024
ISBN (Electronic): 978-8-3503-5931-2
DOI
Status: Published - 2 May 2024
Event: 2024 International Joint Conference on Neural Networks, IJCNN 2024 - Yokohama, Japan
Duration: 30 Jun 2024 - 5 Jul 2024

Conference

Conference: 2024 International Joint Conference on Neural Networks, IJCNN 2024
Country/Territory: Japan
City: Yokohama
Period: 30/06/2024 - 05/07/2024
