
A Fitts’ Law Study of Gaze-Hand Alignment for Selection in 3D User Interfaces

Research output: Contribution to book/anthology/report/proceeding › Article in proceedings › Research › peer-review


Gaze-Hand Alignment has recently been proposed for multimodal selection in 3D. The technique takes advantage of gaze for target pre-selection, as it naturally precedes manual input. Selection is then completed when manual input aligns with gaze on the target, without the need for an additional click method. In this work we evaluate two alignment techniques, Gaze&Finger and Gaze&Handray, combining gaze with image-plane pointing versus raycasting, in comparison with hands-only baselines and Gaze&Pinch as an established multimodal technique. We used a Fitts' Law study design with targets presented at different depths in the visual scene to assess the effect of parallax on performance. The alignment techniques outperformed their respective hands-only baselines. Gaze&Finger is efficient when targets are close to the image plane, but less performant with increasing target depth due to parallax.
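For context, Fitts' Law studies of this kind typically model movement time via the Shannon formulation of the index of difficulty; the specific target distances, widths, and depth conditions used in this study are reported in the paper itself, not in this record. A minimal sketch of the standard model:

$$ID = \log_2\!\left(\frac{D}{W} + 1\right), \qquad MT = a + b \cdot ID$$

where $D$ is the distance to the target, $W$ its width, and $a$, $b$ are empirically fitted constants; throughput is commonly reported as $ID/MT$.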

Original language: English
Title of host publication: CHI 2023 - Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems
Publisher: Association for Computing Machinery
Publication year: 2023
Article number: 252
ISBN (Electronic): 9781450394215
Publication status: Published - 2023

Research areas

  • augmented reality, eye-tracking, gaze interaction, menu selection, mid-air gestures, pointing



ID: 312128744