Multisensory teamwork: using a tactile or an auditory display to exchange gaze information improves performance in joint visual search

Research output: Contribution to journal, Journal article, Research, peer-review

  • Basil Wahn, University Osnabruck, Institute of Cognitive Science
  • Jessika Schwandt, McGill University, Douglas Mental Health Institute, Integrated Program in Neuroscience
  • Matti Krueger, University Osnabruck, Institute of Cognitive Science
  • Daina Crafa
  • Vanessa Nunnendorf, University Osnabruck, Institute of Cognitive Science
  • Peter Koenig, University Medical Center Hamburg-Eppendorf, Institute of Neurophysiology and Pathophysiology

In joint tasks, adjusting to the actions of others is critical for success. For joint visual search tasks, research has shown that when search partners visually receive information about each other's gaze, they use this information to adjust to each other's actions, resulting in faster search performance. The present study provided search partners with information about each other's gaze via a visual, a tactile, or an auditory display. Results showed that search partners performed faster when the gaze information was received via a tactile or auditory display than when it was received via a visual display or when no gaze information was provided. These findings demonstrate the effectiveness of tactile and auditory displays for receiving task-relevant information in joint tasks and are applicable to circumstances in which little or no visual information is available, or in which the visual modality is already taxed with a demanding task such as air-traffic control.

Practitioner Summary: The present study demonstrates that tactile and auditory displays are effective for receiving information about the actions of others in joint tasks. Findings are applicable to circumstances in which little or no visual information is available, or in which the visual modality is already taxed with a demanding task.

Original language: English
Journal: Ergonomics
Volume: 59
Issue: 6
Pages (from-to): 781-795
Number of pages: 15
ISSN: 0014-0139
DOIs
Publication status: Published - 2016
Externally published: Yes

    Research areas

  • joint action, visual search, tactile display, auditory display, sensory motor contingencies, multisensory processing
  • Indexed keywords: modality attentional blinks, automation coordination, sensory augmentation, within-modality, target search, vision, task, perception, localization, resources


ID: 148592566