Eye-Hand Movement of Objects in Near Space Extended Reality

Publication: Contribution to book/anthology/report/proceeding › Conference article in proceedings › Research › peer review

2 Citations (Scopus)

Abstract

Hand-tracking in Extended Reality (XR) enables moving objects in near space with direct hand gestures to pick, drag, and drop objects in 3D. In this work, we investigate the use of eye-tracking to reduce the effort involved in this interaction. As the eyes naturally look ahead to the target of a drag operation, the principal idea is to map the translation of the object in the image plane to gaze, such that the hand only needs to control the depth component of the operation. We implemented four techniques that explore two factors: using gaze only to move objects in X-Y vs. adding refinement by hand, and using hand input in the Z axis to move objects directly vs. indirectly via a transfer function. We compared all four techniques in a user study (N=24) against baselines of direct and indirect hand input. We detail trade-offs in user performance, effort, and experience, and show that all eye-hand techniques significantly reduce physical effort over direct gestures, pointing toward effortless drag-and-drop for XR environments.
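The abstract's core idea can be sketched in a few lines: gaze supplies the object's X-Y position in the image plane, while the hand contributes only the depth (Z) change, either directly or through a transfer function. The sketch below is illustrative only, assuming a simple constant-gain transfer function; the names (`transfer`, `update_object`) and the gain value are hypothetical, not the authors' implementation.

```python
# Illustrative sketch of eye-hand object movement (not the paper's code):
# gaze drives X-Y in the view plane, the hand drives depth (Z) only.

def transfer(hand_dz: float, gain: float = 2.5) -> float:
    """Indirect Z input: amplify the hand's depth motion by a constant
    gain. A hypothetical stand-in for the paper's transfer function."""
    return gain * hand_dz

def update_object(gaze_xy, obj_z, hand_dz, indirect=True):
    """Return the new 3D object position.

    gaze_xy:  (x, y) where the user's gaze intersects the view plane.
    obj_z:    current object depth.
    hand_dz:  depth displacement of the hand since the last frame.
    indirect: apply the transfer function (True) or use raw hand depth.
    """
    dz = transfer(hand_dz) if indirect else hand_dz
    x, y = gaze_xy  # X-Y comes entirely from gaze
    return (x, y, obj_z + dz)

# Example: gaze at (0.3, 0.8); the hand moves 0.1 m toward the user.
pos_indirect = update_object((0.3, 0.8), obj_z=1.0, hand_dz=-0.1)
pos_direct = update_object((0.3, 0.8), obj_z=1.0, hand_dz=-0.1,
                           indirect=False)
```

In the indirect case the 0.1 m hand motion is scaled by the gain before being applied to the object's depth, which is what lets small hand movements cover larger depth ranges.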

Original language: English
Title: UIST 2024 - Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology
Publisher: Association for Computing Machinery
Publication date: Oct. 2024
Article number: 84
ISBN (Electronic): 9798400706288
DOI
Status: Published - Oct. 2024
Event: 37th Annual ACM Symposium on User Interface Software and Technology, UIST 2024 - Pittsburgh, USA
Duration: 13 Oct. 2024 - 16 Oct. 2024

Conference

Conference: 37th Annual ACM Symposium on User Interface Software and Technology, UIST 2024
Country/Territory: USA
City: Pittsburgh
Period: 13/10/2024 - 16/10/2024

