Eye tracking for public displays in the wild

Publication: Contribution to journal/Conference article in journal/Contribution to newspaper › Journal article › Research › peer-review

  • Yanxia Zhang, Lancaster University
  • Ming Ki Chong, Lancaster University, United Kingdom
  • Jörg Müller
  • Andreas Bulling, Max Planck Institute for Informatics
  • Hans Gellersen, Lancaster University, United Kingdom

In public display contexts, interactions are spontaneous and have to work without preparation. We propose gaze as a modality for such contexts, as gaze is always at the ready and is a natural indicator of the user’s interest. We present GazeHorizon, a system that demonstrates spontaneous gaze interaction, enabling users to walk up to a display and navigate content using their eyes only. GazeHorizon is extemporaneous and optimised for instantaneous usability by any user without prior configuration, calibration or training. The system provides interactive assistance to bootstrap gaze interaction with unaware users, employs a single off-the-shelf web camera and computer vision for person-independent tracking of the horizontal gaze direction and maps this input to rate-controlled navigation of horizontally arranged content. We have evaluated GazeHorizon through a series of field studies, culminating in a 4-day deployment in a public environment during which over a hundred passers-by interacted with it, unprompted and unassisted. We found that because eye movements are subtle, users cannot learn gaze interaction merely by observing others; as a result, guidance is required.
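The abstract describes mapping a person-independent estimate of horizontal gaze direction to rate-controlled navigation of horizontal content. The Python sketch below is only an illustration of that general idea, not the authors' implementation: the dead zone, speed cap, frame rate and synthetic gaze samples are assumed values chosen for the example.

```python
# Illustrative sketch (not the GazeHorizon code): rate-controlled horizontal
# scrolling driven by a normalised horizontal gaze estimate in [-1, 1].
# Thresholds and speeds below are assumptions, not values from the paper.

DEAD_ZONE = 0.15   # gaze offsets closer to the display centre than this do not scroll
MAX_SPEED = 600.0  # scroll-speed cap in pixels per second (assumed value)


def gaze_to_scroll_velocity(gaze_x: float) -> float:
    """Convert a horizontal gaze estimate (-1 = far left, +1 = far right)
    into a signed scroll velocity: looking further from the centre scrolls faster."""
    if abs(gaze_x) < DEAD_ZONE:
        return 0.0
    magnitude = (abs(gaze_x) - DEAD_ZONE) / (1.0 - DEAD_ZONE)
    return MAX_SPEED * magnitude * (1.0 if gaze_x > 0 else -1.0)


def integrate_scroll(gaze_samples, dt: float = 1 / 30) -> list[float]:
    """Integrate the velocity once per (assumed 30 Hz) camera frame to obtain
    the content offset, which is what rate control means here."""
    offset, trace = 0.0, []
    for gaze_x in gaze_samples:
        offset += gaze_to_scroll_velocity(gaze_x) * dt
        trace.append(offset)
    return trace


if __name__ == "__main__":
    # Synthetic samples standing in for a webcam-based gaze estimator:
    # the user glances right, returns to the centre, then glances left.
    samples = [0.8] * 30 + [0.0] * 15 + [-0.5] * 30
    print(integrate_scroll(samples)[-1])
```

Rate control of this kind lets a passer-by hold a glance toward one edge of the display to keep content flowing, rather than having to point or dwell on precise targets, which fits the calibration-free, walk-up-and-use setting the paper targets.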

Original language: English
Journal: Personal and Ubiquitous Computing
Volume: 19
Issue: 5-6
Pages (from-to): 967-981
Number of pages: 15
ISSN: 1617-4909
DOI
Status: Published - 3 Jul 2015

    Research areas

  • Calibration-free, Deployment, Eye tracking, Gaze interaction, In-the-wild study, Public displays, Scrolling

