TY - GEN
T1 - Eyes Helping Hands
T2 - Gaze-Assisted Hand Interaction in Extended Reality
AU - Lystbæk, Mathias N.
PY - 2025/11/13
Y1 - 2025/11/13
AB - When we interact with our environment, we use not only our hands but also our eyes. Our eyes play a major role in everyday interaction, locating areas of interest and guiding our hands before movement begins. Extended Reality (XR) promises a seamless blend of physical and digital worlds, visualising virtual components in our real or virtual environment. By having users wear head-mounted displays, we can leverage not just our hands for interaction, as with typical devices, but also our eyes. Leveraging both eyes and hands is particularly important because current hand-based methods often result in challenging interactions, leading to physical fatigue, imprecision, and high cognitive load.
In this thesis, I investigate "Eyes Helping Hands" through three explorations of gaze assistance in XR interaction. In my first work, I conducted a comparative Fitts' Law study extended to 3D with depth, comparing two Gaze-Hand Alignment techniques against three baselines, including Gaze+Pinch and hand-only techniques. In my second work, I present "Hands-on, Hands-off", a novel bimanual (two-handed) interaction framework combining direct and indirect manipulation, investigated in an asymmetric task where the hands have different roles. In my third work, I developed Spatial Gaze Markers, a task-agnostic system that automatically places visual markers where users have looked, based on shifts in attention.
Through these investigations, I found that Gaze-Hand Alignment techniques performed comparably to Gaze+Pinch but degraded with depth due to parallax effects. Direct manipulation exhibited superior bimanual coordination through preemptive hand movement, which indirect interaction lacked. Finally, Spatial Gaze Markers significantly reduced task resumption costs after task switching and distractions.
My contributions demonstrate that by designing interactions leveraging natural eye-hand coordination, we can create XR interactions that are familiar to users while being performant and ergonomic. This thesis opens new directions for attention-aware XR systems and establishes eye-hand coordination as a core design consideration for future XR applications.
M3 - PhD thesis
T3 - PhD Theses - Department of Computer Science
ER -