
Hans Gellersen

Gaze+RST: Integrating Gaze and Multitouch for Remote Rotate-Scale-Translate Tasks

Research output: Contribution to book/anthology/report/proceeding › Article in proceedings › Research › peer-review


Our work investigates the use of gaze and multitouch to fluidly perform rotate-scale-translate (RST) tasks on large displays. The work specifically aims to understand whether gaze can provide benefit in such a task, how task complexity affects performance, and how gaze and multitouch can be combined to create an integral input structure suited to the task of RST. We present four techniques that individually strike a different balance between gaze-based and touch-based translation while maintaining concurrent rotation and scaling operations. A 16-participant empirical evaluation revealed that three of our four techniques present viable options for this scenario, and that larger distances and rotation/scaling operations can significantly affect a gaze-based translation configuration. Furthermore, we uncover new insights regarding multimodal integrality, finding that gaze and touch can be combined into configurations that pertain to integral or separable input structures.
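To illustrate the kind of input mapping the abstract describes, the sketch below shows one plausible way to derive rotation and scale from a two-finger touch gesture while switching translation between a gaze-driven and a touch-driven source. It is a minimal sketch under assumed inputs; the function and parameter names (e.g. `rst_update`, `use_gaze_translation`) are invented for this example and do not reflect the authors' implementation or the four techniques evaluated in the paper.

```python
import math


def rst_update(start_touch, cur_touch, start_gaze, cur_gaze, use_gaze_translation=True):
    """Compute incremental rotate/scale/translate parameters for one gesture update.

    start_touch, cur_touch: two (x, y) finger positions at gesture start and now.
    start_gaze, cur_gaze: (x, y) gaze points on the display at gesture start and now.
    Illustrative sketch only; not the paper's method.
    """
    (x1a, y1a), (x2a, y2a) = start_touch
    (x1b, y1b), (x2b, y2b) = cur_touch

    # Rotation and scale come from how the vector between the two fingers changes.
    v_start = (x2a - x1a, y2a - y1a)
    v_cur = (x2b - x1b, y2b - y1b)
    rotation = math.atan2(v_cur[1], v_cur[0]) - math.atan2(v_start[1], v_start[0])
    scale = math.hypot(*v_cur) / max(math.hypot(*v_start), 1e-6)

    if use_gaze_translation:
        # Gaze-based translation: the manipulated object follows the gaze point.
        translation = (cur_gaze[0] - start_gaze[0], cur_gaze[1] - start_gaze[1])
    else:
        # Touch-based translation: the object follows the midpoint of the two fingers.
        mid_start = ((x1a + x2a) / 2, (y1a + y2a) / 2)
        mid_cur = ((x1b + x2b) / 2, (y1b + y2b) / 2)
        translation = (mid_cur[0] - mid_start[0], mid_cur[1] - mid_start[1])

    return rotation, scale, translation
```

In this reading, keeping rotation, scale, and translation on the same touch gesture corresponds to an integral input structure, whereas routing translation through gaze separates it from the touch-driven dimensions; which configuration performs better under distance and task complexity is what the paper's evaluation examines.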
Original language: English
Title of host publication: CHI '15 Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems
Number of pages: 10
Publisher: ACM
Publication year: 2015
Pages: 4179-4188
ISBN (print): 978-1-4503-3145-6
DOIs
Publication status: Published - 2015
Externally published: Yes

