A Fitts’ Law Study of Gaze-Hand Alignment for Selection in 3D User Interfaces

Research output: Contribution to book/anthology/report/proceeding › Article in proceedings › Research › peer-review


Abstract

Gaze-Hand Alignment has recently been proposed for multimodal selection in 3D. The technique takes advantage of gaze for target pre-selection, as it naturally precedes manual input. Selection is then completed when manual input aligns with gaze on the target, without the need for an additional click method. In this work we evaluate two alignment techniques, Gaze&Finger and Gaze&Handray, which combine gaze with image-plane pointing versus raycasting, in comparison with hands-only baselines and with Gaze&Pinch as an established multimodal technique. We used a Fitts' Law study design with targets presented at different depths in the visual scene to assess the effect of parallax on performance. The alignment techniques outperformed their respective hands-only baselines. Gaze&Finger is efficient when targets are close to the image plane but becomes less performant with increasing target depth due to parallax.
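For context, Fitts' Law study designs such as the one described above model selection time as a function of target distance and size. In the common Shannon formulation (an assumption here; the abstract does not state which variant the paper uses), movement time is:

\[
MT = a + b \,\log_2\!\left(\frac{D}{W} + 1\right)
\]

where \(D\) is the distance to the target, \(W\) is the target width, \(a\) and \(b\) are empirically fitted constants, and the logarithmic term is the index of difficulty (ID) in bits. Techniques are then typically compared by their movement times and throughput across a range of ID values.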

Original language: English
Title of host publication: CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems
Editors: Albrecht Schmidt, Kaisa Väänänen, Tesh Goyal, Per Ola Kristensson, Anicia Peters, Stefanie Mueller, Julie R. Williamson, Max L. Wilson
Place of publication: New York
Publisher: Association for Computing Machinery
Publication date: Apr 2023
Article number: 252
ISBN (Electronic): 9781450394215
DOIs
Publication status: Published - Apr 2023

Keywords

  • augmented reality
  • eye-tracking
  • gaze interaction
  • menu selection
  • mid-air gestures
  • pointing

