
PencilNet: Zero-Shot Sim-to-Real Transfer Learning for Robust Gate Perception in Autonomous Drone Racing

Publication: Contribution to journal › Journal article · Research · Peer-reviewed


  • Xuan Huy Pham
  • Andriy Sarabakha, Nanyang Technological University, Technische Universität München
  • Mykola Odnoshyvkin, Technische Universität München
  • Erdal Kayacan
In autonomous and mobile robotics, one of the main challenges is robust on-the-fly perception of the environment, which is often unknown and dynamic, as in autonomous drone racing. In this work, we propose a novel deep neural network-based perception method for racing-gate detection, PencilNet, which relies on a lightweight neural network backbone on top of a pencil filter. This approach unifies the predictions of a gate's 2D position, distance, and orientation into a single pose tuple. We show that our method is effective for zero-shot sim-to-real transfer learning, requiring no real-world training samples. Moreover, compared with state-of-the-art methods, our framework is highly robust to the illumination changes commonly seen under rapid flight. A thorough set of experiments demonstrates the effectiveness of this approach in multiple challenging scenarios, where the drone completes various tracks under different lighting conditions.
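
As a rough illustration of the kind of preprocessing the abstract describes, the sketch below shows a pencil-sketch-style edge-map transform implemented with OpenCV. The function name pencil_filter, the kernel size, and the file names are illustrative assumptions, not the authors' exact filter; it is only an example of an intensity-invariant input representation that a lightweight detection backbone could consume.

    # Illustrative sketch only: a pencil-sketch-style edge-map preprocessing,
    # assuming OpenCV (cv2) and NumPy are available. This approximates the kind
    # of input transform the abstract describes; it is not the paper's exact filter.
    import cv2
    import numpy as np

    def pencil_filter(bgr_image: np.ndarray, blur_ksize: int = 21) -> np.ndarray:
        """Convert a BGR frame to a pencil-sketch-like single-channel image."""
        gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
        inverted = 255 - gray
        blurred = cv2.GaussianBlur(inverted, (blur_ksize, blur_ksize), 0)
        # Color-dodge blend: suppresses smooth texture, keeps dark edge strokes.
        sketch = cv2.divide(gray, 255 - blurred, scale=256)
        return sketch

    if __name__ == "__main__":
        frame = cv2.imread("gate.png")  # hypothetical input frame
        if frame is not None:
            edge_map = pencil_filter(frame)
            # The edge map would then be fed to the lightweight detection backbone.
            cv2.imwrite("gate_pencil.png", edge_map)

The intuition, hedged here, is that such an edge-like representation discards most texture and illumination detail, which is one plausible reason a network trained purely on simulated images can transfer to real frames without real-world training samples.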
Original language: English
Journal: IEEE Robotics and Automation Letters
Volume: 7
Issue: 4
Pages (from-to): 11847-11854
Number of pages: 8
ISSN: 2377-3766
DOI
Status: Published - Oct. 2022

