HeadShift: Head Pointing with Dynamic Control-Display Gain

Haopeng Wang*, Ludwig Sidenmark, Florian Weidner, Joshua Newn, Hans Gellersen

*Corresponding author for this work

Research output: Contribution to journal › Journal article › Research › peer-review

1 Citation (Scopus)

Abstract

Head pointing is widely used for hands-free input in head-mounted displays (HMDs). The primary role of head movement in an HMD is to control the viewport based on an absolute mapping of head rotation to the 3D environment. Head pointing is conventionally supported by the same 1:1 mapping of input, with a cursor fixed in the centre of the view, but this requires exaggerated head movement and limits input granularity. In this work, we propose adopting dynamic gain to improve ergonomics and precision, and introduce the HeadShift technique. The design of HeadShift is grounded in natural eye-head coordination to manage control of the viewport and the cursor at different speeds. We evaluated HeadShift in a Fitts' Law experiment and in three different applications in VR, finding that the technique reduces error rate and effort. The findings are significant as they show that gain can be adopted effectively for head pointing while ensuring that the cursor is maintained within a comfortable eye-in-head viewing range.
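The abstract's core idea is to vary the control-display gain of a head-driven cursor with head speed, so that slow head movements give fine control while fast movements cover more distance, with the cursor kept within a comfortable eye-in-head viewing range. The sketch below illustrates this idea under stated assumptions only: a linear velocity-to-gain mapping, hand-picked speed thresholds, and a fixed comfort range. These are not HeadShift's actual transfer function or parameters, which are specified in the article itself.

```python
# Illustrative sketch only: a simple velocity-based control-display (CD) gain
# for a head-pointing cursor. The gain curve, speed thresholds and comfort
# range below are assumptions for demonstration, not HeadShift's actual design.

def dynamic_gain(head_speed_deg_s: float,
                 low_speed: float = 10.0,   # assumed deg/s: below this, gain stays low for precision
                 high_speed: float = 60.0,  # assumed deg/s: above this, gain is maximal for reach
                 min_gain: float = 0.5,
                 max_gain: float = 2.0) -> float:
    """Map head angular speed to a CD gain between min_gain and max_gain."""
    t = (head_speed_deg_s - low_speed) / (high_speed - low_speed)
    t = min(max(t, 0.0), 1.0)  # clamp interpolation factor to [0, 1]
    return min_gain + t * (max_gain - min_gain)


def update_cursor(cursor_offset_deg: float,
                  head_delta_deg: float,
                  dt_s: float,
                  comfort_range_deg: float = 15.0) -> float:
    """Advance the cursor's offset from the view centre by gained head rotation.

    The viewport itself still follows the head 1:1; only the extra
    (gain - 1) portion shifts the cursor relative to the view centre,
    and the offset is clamped to an assumed comfortable eye-in-head range.
    """
    speed = abs(head_delta_deg) / dt_s if dt_s > 0 else 0.0
    gain = dynamic_gain(speed)
    cursor_offset_deg += (gain - 1.0) * head_delta_deg
    return max(-comfort_range_deg, min(comfort_range_deg, cursor_offset_deg))


# Example frame update: a fast 5-degree head turn in ~11 ms pushes the cursor
# ahead of the view centre, while slow movements keep it close to 1:1.
offset = update_cursor(0.0, head_delta_deg=5.0, dt_s=0.011)
print(round(offset, 2))  # 5.0 with the assumed parameters
```

Any non-linear gain curve could replace dynamic_gain; the design point reflected from the abstract is that gain varies with head speed and the cursor remains within a comfortable viewing range of the view centre.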

Original language: English
Article number: 2
Journal: ACM Transactions on Computer-Human Interaction
Volume: 32
Issue: 1
Number of pages: 28
ISSN: 1073-0516
DOIs
Publication status: Published - 19 Apr 2025

Keywords

  • Control-Display Gain
  • Head Mounted Display
  • Pointing
  • Virtual Reality

