
Event-based Navigation for Autonomous Drone Racing with Sparse Gated Recurrent Network

Research output: Contribution to book/anthology/report/proceeding › Article in proceedings › Research › peer-review

Event-based vision has already revolutionized the perception task for robots by promising faster response, lower energy consumption, and lower bandwidth without introducing motion blur. In this work, we propose a novel deep learning method for the autonomous drone racing problem: a gated recurrent network that uses sparse convolutions on event-based vision data to detect the gates of a race track. We demonstrate the efficiency and efficacy of the perception pipeline on a real robot platform that can safely navigate a typical autonomous drone racing track in real time. Throughout the experiments, we show that event-based vision, combined with the proposed gated recurrent unit and models pretrained on simulated event data, significantly improves gate detection precision. Furthermore, an event-based drone racing dataset consisting of both simulated and real data sequences is publicly released; the code and data are available at https://github.com/open-airlab/neuromorphic-au-drone-racing.git.
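As a rough illustration of the kind of architecture the abstract describes, the sketch below combines a convolutional encoder with a ConvGRU-style recurrent update over binned event frames. This is an assumption on our part, not the authors' released code (see the repository above for that): dense convolutions stand in for the sparse convolutions used in the paper, and all layer sizes, channel counts, and the 5-value per-cell detection head are illustrative.

```python
# Minimal sketch (assumed, not the authors' code) of a recurrent gate detector
# for event frames: a convolutional encoder followed by a ConvGRU-style update.
# Dense convolutions replace the paper's sparse convolutions; shapes are illustrative.
import torch
import torch.nn as nn

class ConvGRUCell(nn.Module):
    """Convolutional GRU cell operating on 2D feature maps."""
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        p = k // 2
        self.gates = nn.Conv2d(in_ch + hid_ch, 2 * hid_ch, k, padding=p)  # update/reset gates
        self.cand = nn.Conv2d(in_ch + hid_ch, hid_ch, k, padding=p)       # candidate state
        self.hid_ch = hid_ch

    def forward(self, x, h):
        if h is None:  # initialize hidden state on the first time step
            h = x.new_zeros(x.size(0), self.hid_ch, x.size(2), x.size(3))
        z, r = torch.sigmoid(self.gates(torch.cat([x, h], dim=1))).chunk(2, dim=1)
        h_tilde = torch.tanh(self.cand(torch.cat([x, r * h], dim=1)))
        return (1 - z) * h + z * h_tilde

class GateDetector(nn.Module):
    """Encoder + ConvGRU + per-grid-cell prediction head (confidence, x, y, w, h)."""
    def __init__(self, in_ch=2, hid_ch=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.gru = ConvGRUCell(32, hid_ch)
        self.head = nn.Conv2d(hid_ch, 5, 1)  # 5 outputs per grid cell

    def forward(self, event_frames):
        # event_frames: (T, B, 2, H, W), one two-polarity frame per time bin
        h = None
        for x in event_frames:
            h = self.gru(self.encoder(x), h)
        return self.head(h)

# Usage example: 4 time bins of 128x128 two-polarity event frames.
if __name__ == "__main__":
    frames = torch.rand(4, 1, 2, 128, 128)
    print(GateDetector()(frames).shape)  # -> torch.Size([1, 5, 32, 32])
```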

Original language: English
Title of host publication: 2022 European Control Conference (ECC)
Number of pages: 7
Publisher: IEEE
Publication year: Aug 2022
Pages: 1342-1348
ISBN (electronic): 9783907144077
DOIs
Publication status: Published - Aug 2022
Event: 2022 European Control Conference, ECC 2022 - London, United Kingdom
Duration: 12 Jul 2022 - 15 Jul 2022

Conference

Conference: 2022 European Control Conference, ECC 2022
Country: United Kingdom
City: London
Period: 12/07/2022 - 15/07/2022
Series: 2022 European Control Conference, ECC 2022

Bibliographical note

Publisher Copyright:
© 2022 EUCA.

