
GazeConduits: Calibration-Free Cross-Device Collaboration through Gaze and Touch

Research output: Contribution to book/anthology/report/proceeding › Article in proceedings › Research › peer-review

  • Simon Voelker, RWTH Aachen University
  • Sebastian Hueber, RWTH Aachen University
  • Christian Holz, Swiss Federal Institute of Technology Zurich
  • Christian Remy
  • Nicolai Marquardt, University College London

We present GazeConduits, a calibration-free ad-hoc mobile interaction concept that enables users to collaboratively interact with tablets, other users, and content in a cross-device setting using gaze and touch input. GazeConduits leverages recently introduced smartphone capabilities to detect facial features and estimate users' gaze directions. To join a collaborative setting, users place one or more tablets onto a shared table and position their phone in the center, which then tracks users present as well as their gaze direction to determine the tablets they look at. We present a series of techniques using GazeConduits for collaborative interaction across mobile devices for content selection and manipulation. Our evaluation with 20 simultaneous tablets on a table shows that GazeConduits can reliably identify which tablet or collaborator a user is looking at.
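
The record includes no code, but the core step described in the abstract, mapping a phone-estimated gaze direction onto the tablet a user is looking at, can be sketched geometrically. The Python snippet below is a hypothetical illustration rather than the authors' implementation: it assumes the user's head position, gaze direction, and tablet centres are all expressed in a shared 2D table-plane frame, and it simply picks the tablet whose centre is angularly closest to the gaze ray, rejecting matches beyond a tolerance.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch, not the GazeConduits implementation: resolve which
# tablet on the table a user is looking at, given a 2D head position and
# an estimated gaze direction in the table-plane coordinate frame.

@dataclass
class Tablet:
    name: str
    x: float  # tablet centre on the table plane, metres
    y: float

def angular_offset(head, gaze_dir, tablet):
    """Angle (radians) between the gaze direction and the head-to-tablet vector."""
    to_tablet = (tablet.x - head[0], tablet.y - head[1])
    dot = gaze_dir[0] * to_tablet[0] + gaze_dir[1] * to_tablet[1]
    norm = math.hypot(*gaze_dir) * math.hypot(*to_tablet)
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def gaze_target(head, gaze_dir, tablets, max_offset_deg=15.0):
    """Return the tablet the user most plausibly looks at, or None if no
    tablet lies within the angular tolerance (threshold is illustrative)."""
    best = min(tablets, key=lambda t: angular_offset(head, gaze_dir, t))
    if math.degrees(angular_offset(head, gaze_dir, best)) <= max_offset_deg:
        return best
    return None

# Example: a user seated at (0, -0.6) m looks towards the upper right.
tablets = [Tablet("A", -0.3, 0.2), Tablet("B", 0.0, 0.3), Tablet("C", 0.3, 0.2)]
print(gaze_target((0.0, -0.6), (0.35, 0.8), tablets))  # resolves to tablet "C"
```

In practice the gaze direction would come from the central phone's face and gaze estimation, and the matching would run per user and per frame; the angular-nearest-match idea is only meant to make the selection step concrete.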

Original language: English
Title of host publication: CHI 2020 - Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems
Number of pages: 10
Place of publication: New York
Publisher: Association for Computing Machinery
Publication year: 2020
Article number: 3376578
ISBN (Electronic): 9781450367080
Publication status: Published - 2020
Event: 2020 ACM CHI Conference on Human Factors in Computing Systems, CHI 2020 - Honolulu, United States
Duration: 25 Apr 2020 - 30 Apr 2020

Conference

Conference: 2020 ACM CHI Conference on Human Factors in Computing Systems, CHI 2020
Country: United States
City: Honolulu
Period: 25/04/2020 - 30/04/2020
Sponsor: ACM SIGCHI

    Research areas

  • cross-device interaction, gaze input, touch input


ID: 197679693