Abstract
We present GazeConduits, a calibration-free ad-hoc mobile interaction concept that enables users to collaboratively interact with tablets, other users, and content in a cross-device setting using gaze and touch input. GazeConduits leverages recently introduced smartphone capabilities to detect facial features and estimate users' gaze directions. To join a collaborative setting, users place one or more tablets onto a shared table and position their phone in the center, which then tracks the users present as well as their gaze directions to determine which tablets they look at. We introduce a series of techniques using GazeConduits for collaborative interaction across mobile devices for content selection and manipulation. Our evaluation with 20 simultaneous tablets on a table shows that GazeConduits can reliably identify which tablet or collaborator a user is looking at.
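The core mapping the abstract describes, from an estimated gaze direction to the tablet being looked at, can be sketched as a ray-plane intersection followed by a nearest-tablet lookup. The geometry and function names below are illustrative assumptions, not the authors' implementation:

```python
import math

def intersect_table_plane(origin, direction):
    """Intersect a gaze ray with the table plane z = 0.

    origin: (x, y, z) estimated head position above the table (z > 0).
    direction: (dx, dy, dz) gaze direction; dz < 0 when looking down.
    Returns the (x, y) hit point on the table, or None if the ray
    never reaches the plane.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz >= 0:  # gaze parallel to or away from the table surface
        return None
    t = -oz / dz  # ray parameter where z reaches 0
    return (ox + t * dx, oy + t * dy)

def tablet_at_gaze(origin, direction, tablet_centers):
    """Return the index of the tablet center closest to where the
    gaze ray hits the table, or None if there is no intersection."""
    hit = intersect_table_plane(origin, direction)
    if hit is None or not tablet_centers:
        return None
    hx, hy = hit
    return min(range(len(tablet_centers)),
               key=lambda i: math.hypot(tablet_centers[i][0] - hx,
                                        tablet_centers[i][1] - hy))
```

For example, a user whose head is 40 cm above the table origin and who looks down and to the right would be assigned the tablet lying in that direction; in practice the paper's evaluation measures how reliably such an assignment holds with 20 tablets present.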
| Original language | English |
|---|---|
| Title | CHI 2020 - Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems |
| Number of pages | 10 |
| Place of publication | New York |
| Publisher | Association for Computing Machinery |
| Publication date | 2020 |
| Article number | 3376578 |
| ISBN (electronic) | 9781450367080 |
| DOI | |
| Status | Published - 2020 |
| Event | 2020 ACM CHI Conference on Human Factors in Computing Systems, CHI 2020 - Honolulu, USA. Duration: 25 Apr 2020 → 30 Apr 2020 |
Conference

| Conference | 2020 ACM CHI Conference on Human Factors in Computing Systems, CHI 2020 |
|---|---|
| Country/Territory | USA |
| City | Honolulu |
| Period | 25/04/2020 → 30/04/2020 |