Abstract
Eye tracking filters have been shown to improve the accuracy of gaze estimation and gaze input in stationary settings. However, their effectiveness during physical movement remains underexplored. In this work, we compare common online filters in the context of physical locomotion in extended reality and propose alterations to improve them for on-the-go settings. We conducted a computational experiment in which we simulated the performance of the online filters using data from participants attending visual targets located in world-, path-, and two head-based reference frames while standing, walking, and jogging. Our results provide insights into the filters' effectiveness and the factors that affect it, such as the amount of noise caused by locomotion and differences in compensatory eye movements, and demonstrate that filters with saccade detection prove most useful for on-the-go settings. We discuss the implications of our findings and conclude with guidance on gaze data filtering for interaction in extended reality.
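To illustrate the kind of online, saccade-aware filtering the abstract refers to, the sketch below shows a generic exponential-smoothing gaze filter that bypasses smoothing when a velocity threshold flags a saccade. This is a minimal illustrative example, not the authors' implementation; the function name, the smoothing factor `alpha`, and the `saccade_velocity_deg_s` threshold are all hypothetical choices.

```python
# Minimal sketch of a saccade-aware online gaze filter (illustrative only;
# not the implementation evaluated in the paper).

def make_saccade_aware_filter(alpha=0.2, saccade_velocity_deg_s=100.0):
    """Exponential smoothing that passes saccades through unfiltered.

    alpha: smoothing factor during fixations (lower = smoother, more lag).
    saccade_velocity_deg_s: hypothetical angular-velocity threshold (deg/s)
    above which a sample is treated as a saccade.
    """
    state = {"prev_raw": None, "filtered": None, "prev_t": None}

    def step(gaze_deg, t):
        # gaze_deg: (x, y) gaze angles in degrees; t: timestamp in seconds.
        if state["filtered"] is None:
            state.update(prev_raw=gaze_deg, filtered=gaze_deg, prev_t=t)
            return gaze_deg
        dt = max(t - state["prev_t"], 1e-6)
        vx = (gaze_deg[0] - state["prev_raw"][0]) / dt
        vy = (gaze_deg[1] - state["prev_raw"][1]) / dt
        speed = (vx * vx + vy * vy) ** 0.5
        if speed > saccade_velocity_deg_s:
            # Saccade detected: reset to the raw sample so the filter
            # does not lag behind the rapid gaze jump.
            state["filtered"] = gaze_deg
        else:
            # Fixation or pursuit: exponential smoothing suppresses noise,
            # e.g. the locomotion-induced jitter discussed in the abstract.
            fx = alpha * gaze_deg[0] + (1 - alpha) * state["filtered"][0]
            fy = alpha * gaze_deg[1] + (1 - alpha) * state["filtered"][1]
            state["filtered"] = (fx, fy)
        state["prev_raw"] = gaze_deg
        state["prev_t"] = t
        return state["filtered"]

    return step
```

The design choice being illustrated is the one the abstract highlights: smoothing stabilizes noisy fixation samples, while explicit saccade detection avoids the end-to-end latency a filter would otherwise add to large, fast gaze shifts.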
Original language | English |
---|---|
Journal | IEEE Transactions on Visualization and Computer Graphics |
Number of pages | 9 |
ISSN | 1077-2626 |
DOIs | |
Publication status | E-pub ahead of print - 10 Sept 2024 |
Keywords
- eye tracking
- gaze filters
- gaze-based pointing
- extended reality
- spatial reference frames
- physical locomotion