
Head Up Visualization of Spatial Sound Sources in Virtual Reality for Deaf and Hard-of-Hearing People

Research output: Contribution to book/anthology/report/proceeding › Article in proceedings › Research › peer-review

Standard

Head Up Visualization of Spatial Sound Sources in Virtual Reality for Deaf and Hard-of-Hearing People. / Mirzaei, Mohammadreza; Kán, Peter; Kaufmann, Hannes.

2021 IEEE Virtual Reality and 3D User Interfaces (VR). IEEE, 2021. p. 582-587.

Research output: Contribution to book/anthology/report/proceeding › Article in proceedings › Research › peer-review

Harvard

Mirzaei, M, Kán, P & Kaufmann, H 2021, Head Up Visualization of Spatial Sound Sources in Virtual Reality for Deaf and Hard-of-Hearing People. in 2021 IEEE Virtual Reality and 3D User Interfaces (VR). IEEE, pp. 582-587, IEEE Annual International Symposium Virtual Reality, 27/03/2021. https://doi.org/10.1109/VR50410.2021.00083

APA

Mirzaei, M., Kán, P., & Kaufmann, H. (2021). Head Up Visualization of Spatial Sound Sources in Virtual Reality for Deaf and Hard-of-Hearing People. In 2021 IEEE Virtual Reality and 3D User Interfaces (VR) (pp. 582-587). IEEE. https://doi.org/10.1109/VR50410.2021.00083

CBE

Mirzaei M, Kán P, Kaufmann H. 2021. Head Up Visualization of Spatial Sound Sources in Virtual Reality for Deaf and Hard-of-Hearing People. In 2021 IEEE Virtual Reality and 3D User Interfaces (VR). IEEE. pp. 582-587. https://doi.org/10.1109/VR50410.2021.00083

MLA

Mirzaei, Mohammadreza, Peter Kán, and Hannes Kaufmann. "Head Up Visualization of Spatial Sound Sources in Virtual Reality for Deaf and Hard-of-Hearing People". 2021 IEEE Virtual Reality and 3D User Interfaces (VR). IEEE, 2021, pp. 582-587. https://doi.org/10.1109/VR50410.2021.00083

Vancouver

Mirzaei M, Kán P, Kaufmann H. Head Up Visualization of Spatial Sound Sources in Virtual Reality for Deaf and Hard-of-Hearing People. In 2021 IEEE Virtual Reality and 3D User Interfaces (VR). IEEE. 2021. p. 582-587. https://doi.org/10.1109/VR50410.2021.00083

Author

Mirzaei, Mohammadreza ; Kán, Peter ; Kaufmann, Hannes. / Head Up Visualization of Spatial Sound Sources in Virtual Reality for Deaf and Hard-of-Hearing People. 2021 IEEE Virtual Reality and 3D User Interfaces (VR). IEEE, 2021. pp. 582-587

Bibtex

@inproceedings{71c42645a512461685130628f485c377,
title = "Head Up Visualization of Spatial Sound Sources in Virtual Reality for Deaf and Hard-of-Hearing People",
abstract = "This paper presents a novel method for the visualization of 3D spatial sounds in Virtual Reality (VR) for Deaf and Hard-of-Hearing (DHH) people. Our method enhances traditional VR devices with additional haptic and visual feedback, which aids spatial sound localization. The proposed system automatically analyses 3D sound from VR application, and it indicates the direction of sound sources to a user by two Vibro-motors and two Light-Emitting Diodes (LEDs). The benefit of automatic sound analysis is that our method can be used in any VR application without modifying the application itself. We evaluated the proposed method for 3D spatial sound visualization in a user study. Additionally, the conducted user study investigated which condition (corresponding to different senses) leads to faster performance in 3D sound localization task. For this purpose, we compared three conditions: haptic feedback only, LED feedback only, combined haptic and LED feedback. Our study results suggest that DHH participants could complete sound-related VR tasks significantly faster using LED and haptic+LED conditions in comparison to only haptic feedback. The presented method for spatial sound visualization can be directly used to enhance VR applications for use by DHH persons, and the results of our user study can serve as guidelines for the future design of accessible VR systems.",
author = "Mohammadreza Mirzaei and Peter K{\'a}n and Hannes Kaufmann",
year = "2021",
doi = "10.1109/VR50410.2021.00083",
language = "English",
isbn = "978-0-7381-2556-5",
pages = "582--587",
booktitle = "2021 IEEE Virtual Reality and 3D User Interfaces (VR)",
publisher = "IEEE",
note = "IEEE Annual International Symposium Virtual Reality ; Conference date: 27-03-2021 Through 03-04-2021",

}

RIS

TY - GEN

T1 - Head Up Visualization of Spatial Sound Sources in Virtual Reality for Deaf and Hard-of-Hearing People

AU - Mirzaei, Mohammadreza

AU - Kán, Peter

AU - Kaufmann, Hannes

PY - 2021

Y1 - 2021

N2 - This paper presents a novel method for the visualization of 3D spatial sounds in Virtual Reality (VR) for Deaf and Hard-of-Hearing (DHH) people. Our method enhances traditional VR devices with additional haptic and visual feedback, which aids spatial sound localization. The proposed system automatically analyses 3D sound from a VR application and indicates the direction of sound sources to the user with two vibro-motors and two Light-Emitting Diodes (LEDs). The benefit of automatic sound analysis is that our method can be used in any VR application without modifying the application itself. We evaluated the proposed method for 3D spatial sound visualization in a user study. Additionally, the user study investigated which condition (corresponding to different senses) leads to faster performance in a 3D sound localization task. For this purpose, we compared three conditions: haptic feedback only, LED feedback only, and combined haptic and LED feedback. Our study results suggest that DHH participants could complete sound-related VR tasks significantly faster using the LED and haptic+LED conditions than with haptic feedback only. The presented method for spatial sound visualization can be directly used to enhance VR applications for use by DHH persons, and the results of our user study can serve as guidelines for the future design of accessible VR systems.

AB - This paper presents a novel method for the visualization of 3D spatial sounds in Virtual Reality (VR) for Deaf and Hard-of-Hearing (DHH) people. Our method enhances traditional VR devices with additional haptic and visual feedback, which aids spatial sound localization. The proposed system automatically analyses 3D sound from a VR application and indicates the direction of sound sources to the user with two vibro-motors and two Light-Emitting Diodes (LEDs). The benefit of automatic sound analysis is that our method can be used in any VR application without modifying the application itself. We evaluated the proposed method for 3D spatial sound visualization in a user study. Additionally, the user study investigated which condition (corresponding to different senses) leads to faster performance in a 3D sound localization task. For this purpose, we compared three conditions: haptic feedback only, LED feedback only, and combined haptic and LED feedback. Our study results suggest that DHH participants could complete sound-related VR tasks significantly faster using the LED and haptic+LED conditions than with haptic feedback only. The presented method for spatial sound visualization can be directly used to enhance VR applications for use by DHH persons, and the results of our user study can serve as guidelines for the future design of accessible VR systems.

U2 - 10.1109/VR50410.2021.00083

DO - 10.1109/VR50410.2021.00083

M3 - Article in proceedings

SN - 978-0-7381-2556-5

SP - 582

EP - 587

BT - 2021 IEEE Virtual Reality and 3D User Interfaces (VR)

PB - IEEE

T2 - IEEE Annual International Symposium Virtual Reality

Y2 - 27 March 2021 through 3 April 2021

ER -
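
Note: the abstract describes mapping the direction of a detected sound source to left/right vibro-motor and LED feedback. The sketch below is a minimal, hypothetical illustration of such a mapping, assuming a simple horizontal-plane linear-panning model; the function names, the (x, z) coordinate convention, and the mapping itself are assumptions for illustration only and are not taken from the paper.

# Illustrative sketch only: map a sound-source direction (relative to the
# listener's head) to left/right intensities in [0, 1], which could drive
# two vibro-motors and two LEDs. Not the authors' implementation.
import math

def direction_to_feedback(source_pos, head_pos, head_yaw_deg):
    """Return (left, right) intensities in [0, 1] for one sound source.

    source_pos, head_pos: (x, z) positions on the horizontal plane.
    head_yaw_deg: listener's head yaw in degrees (0 = facing +z).
    """
    dx = source_pos[0] - head_pos[0]
    dz = source_pos[1] - head_pos[1]
    # Angle of the source relative to the head's forward direction, in degrees.
    source_angle = math.degrees(math.atan2(dx, dz)) - head_yaw_deg
    # Wrap to (-180, 180]: negative = source on the left, positive = on the right.
    source_angle = (source_angle + 180.0) % 360.0 - 180.0

    # Simple linear panning: full intensity on one side at +/-90 deg, equal at 0 deg.
    pan = max(-1.0, min(1.0, source_angle / 90.0))
    right = (1.0 + pan) / 2.0
    left = 1.0 - right
    return left, right

if __name__ == "__main__":
    # Source slightly to the listener's right and ahead.
    l, r = direction_to_feedback(source_pos=(1.0, 2.0), head_pos=(0.0, 0.0), head_yaw_deg=0.0)
    print(f"left={l:.2f} right={r:.2f}")  # prints left=0.35 right=0.65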