Filtering on the Go: Effect of Filters on Gaze Pointing Accuracy During Physical Locomotion in Extended Reality

Pavel Manakhov*, Ludwig Sidenmark, Ken Pfeuffer, Hans Gellersen

*Corresponding author for this work

Research output: Contribution to journal › Journal article › Research › peer-review


Abstract

Eye tracking filters have been shown to improve the accuracy of gaze estimation and input in stationary settings. However, their effectiveness during physical movement remains underexplored. In this work, we compare common online filters in the context of physical locomotion in extended reality and propose alterations to improve them for on-the-go settings. We conducted a computational experiment in which we simulated the performance of the online filters using data from participants attending to visual targets located in world-, path-, and two head-based reference frames while standing, walking, and jogging. Our results provide insights into the filters' effectiveness and the factors that affect it, such as the amount of noise caused by locomotion and differences in compensatory eye movements, and demonstrate that filters with saccade detection prove most useful for on-the-go settings. We discuss the implications of our findings and conclude with guidance on gaze data filtering for interaction in extended reality.
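The abstract highlights that online filters combined with saccade detection work best on the go. As a generic illustration only (not the paper's implementation), the sketch below shows one common form of such a filter: exponential smoothing of the gaze signal during fixations, bypassed when a velocity-threshold (I-VT-style) saccade detector fires so the filtered gaze does not lag behind rapid gaze shifts. Class name, smoothing factor, and velocity threshold are illustrative assumptions.

```python
import math

class SaccadeAwareFilter:
    """Illustrative online gaze filter: exponential smoothing that is
    bypassed when a velocity-threshold saccade detector fires.
    Parameters are illustrative, not taken from the paper."""

    def __init__(self, alpha=0.1, saccade_velocity_deg_s=100.0):
        self.alpha = alpha                            # smoothing factor during fixations
        self.saccade_velocity = saccade_velocity_deg_s
        self.prev_sample = None                       # (timestamp_s, yaw_deg, pitch_deg)
        self.state = None                             # filtered (yaw_deg, pitch_deg)

    def update(self, t, yaw, pitch):
        """Feed one gaze sample (seconds, degrees); return the filtered gaze."""
        if self.prev_sample is None:
            self.prev_sample = (t, yaw, pitch)
            self.state = (yaw, pitch)
            return self.state

        t0, yaw0, pitch0 = self.prev_sample
        dt = max(t - t0, 1e-6)
        # Angular velocity in deg/s (small-angle approximation).
        velocity = math.hypot(yaw - yaw0, pitch - pitch0) / dt

        if velocity > self.saccade_velocity:
            # Saccade: pass the raw sample through so the output keeps up
            # with the rapid gaze shift instead of lagging behind it.
            self.state = (yaw, pitch)
        else:
            # Fixation or pursuit: exponential smoothing suppresses noise,
            # e.g. the extra jitter introduced by walking or jogging.
            fy = self.alpha * yaw + (1 - self.alpha) * self.state[0]
            fp = self.alpha * pitch + (1 - self.alpha) * self.state[1]
            self.state = (fy, fp)

        self.prev_sample = (t, yaw, pitch)
        return self.state
```

In such a scheme, the trade-off studied in the paper shows up directly: a smaller smoothing factor suppresses more locomotion-induced noise but increases lag, while the saccade gate keeps large, intentional gaze shifts responsive.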
Original language: English
Journal: IEEE Transactions on Visualization and Computer Graphics
Volume: 30
Issue: 11
Pages (from-to): 7234-7244
Number of pages: 11
ISSN: 1077-2626
DOIs
Publication status: Published - Nov 2024

Keywords

  • extended reality
  • eye tracking
  • gaze filters
  • gaze-based pointing
  • physical locomotion
  • spatial reference frames
