Real-time tracking and visual feedback make interactive AR-assisted capture systems a convenient, low-cost alternative to specialized sensor rigs and robotic gantries. We present a simple strategy for decoupling localization and visual feedback in these applications from the primary sensor used to capture the scene. Our strategy is to use an AR HMD and a 6-DOF controller for tracking and feedback, synchronized with a separate primary sensor that captures the scene. This approach enables convenient real-time localization of sensors that cannot localize themselves (e.g., microphones). In this poster paper, we present a prototype implementation of this strategy and investigate the accuracy of decoupled tracking by mounting a high-resolution camera as the primary sensor and comparing the decoupled runtime pose estimates to the pose estimates of a high-resolution offline structure-from-motion reconstruction.
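The evaluation described above compares runtime poses against offline structure-from-motion poses. The paper does not specify the error metric; a common choice, sketched below as an assumption, is the Euclidean translation error together with the geodesic rotation angle between paired poses (`pose_errors` and the example values are illustrative, not the authors' code):

```python
import numpy as np

def pose_errors(R_est, t_est, R_ref, t_ref):
    """Translation error (meters) and rotation error (degrees)
    between an estimated pose and a reference pose."""
    t_err = np.linalg.norm(t_est - t_ref)
    # Relative rotation; its angle is the geodesic distance on SO(3).
    R_rel = R_est.T @ R_ref
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    r_err = np.degrees(np.arccos(cos_angle))
    return t_err, r_err

# Hypothetical example: estimate offset by 5 mm and a 2-degree yaw.
theta = np.radians(2.0)
R_ref, t_ref = np.eye(3), np.zeros(3)
R_est = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0, 0.0, 1.0]])
t_est = np.array([0.005, 0.0, 0.0])
t_err, r_err = pose_errors(R_est, t_est, R_ref, t_ref)
```

In practice the two trajectories would first need temporal synchronization and a rigid (or similarity) alignment before per-pose errors are meaningful.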
2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)
Published - Nov. 2020
19th IEEE International Symposium on Mixed and Augmented Reality - Virtual conference
Duration: 9 Nov. 2020 → 13 Nov. 2020