A Real-Time Method for Time-to-Collision Estimation from Aerial Images

Daniel Tøttrup, Stinus Lykke Skovgaard, Jonas Le Fevre Sejersen, Rui Pimentel de Figueiredo*

*Corresponding author for this work

Research output: Contribution to journal › Journal article › Research › peer-review

Abstract

Large vessels such as container ships rely on experienced pilots, with extensive knowledge of the local streams and tides, who are responsible for maneuvering the vessel to its desired location. This work proposes estimating the time-to-collision (TTC) between moving objects (i.e., vessels) using real-time video data captured from aerial drones in dynamic maritime environments. Our deep-learning-based methods use features optimized with realistic, virtually generated data for reliable and robust object detection, segmentation, and tracking. Furthermore, we use rotated bounding box representations, obtained from fine semantic segmentation of objects, for enhanced TTC estimation accuracy. We present collision estimates intuitively as collision arrows that gradually change color to red as a collision becomes imminent. Experiments conducted in a realistic dockyard virtual environment show that our approaches precisely, robustly, and efficiently predict the TTC between dynamic objects seen from a top view, with a mean error and a standard deviation of 0.358 s and 0.114 s, respectively, in a worst-case scenario.
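To illustrate the kind of computation the abstract describes, the sketch below estimates a TTC between two tracked objects and maps it to a green-to-red arrow color. This is a minimal, hypothetical example using constant-velocity circular proxies for the vessels, not the paper's method (which uses rotated bounding boxes from semantic segmentation); the function names and the `horizon` parameter are assumptions for illustration.

```python
import math

def time_to_collision(p1, v1, p2, v2, radius_sum):
    """Smallest t >= 0 at which two constant-velocity circular proxies
    (centers p1, p2; velocities v1, v2; summed radii radius_sum) touch.
    Solves |p_rel + t * v_rel| = radius_sum; returns None if no collision.
    NOTE: illustrative stand-in, not the paper's rotated-bounding-box method."""
    px, py = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    a = vx * vx + vy * vy
    if a == 0.0:                            # identical velocities: gap is constant
        return None
    b = 2.0 * (px * vx + py * vy)
    c = px * px + py * py - radius_sum ** 2
    disc = b * b - 4.0 * a * c
    if disc < 0.0:                          # closest approach never reaches contact
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # earlier root = first contact
    return t if t >= 0.0 else None

def arrow_color(ttc, horizon=10.0):
    """Map a TTC (seconds) to an RGB color: green when far from collision,
    shading toward red as the TTC shrinks. `horizon` is an assumed cutoff."""
    if ttc is None or ttc >= horizon:
        return (0, 255, 0)
    u = max(0.0, min(1.0, 1.0 - ttc / horizon))
    return (int(255 * u), int(255 * (1.0 - u)), 0)
```

For example, two vessels approaching head-on at 1 m/s each from 10 m apart, with a 2 m contact distance, close an 8 m gap at 2 m/s, giving a TTC of 4 s, and the arrow color for that TTC is already more red than green.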

Original language: English
Article number: 62
Journal: Journal of Imaging
Volume: 8
Issue: 3
ISSN: 2313-433X
DOIs
Publication status: Published - Mar 2022

Keywords

  • Convolutional neural networks
  • Multiple-object tracking
  • Time-to-collision estimation
