
huSync - A model and system for the measure of synchronization in small groups: A case study on musical joint action

Research output: Contribution to journal › Journal article › Research › peer-review

Standard

huSync - A model and system for the measure of synchronization in small groups: A case study on musical joint action. / Sabharwal, Sanket Rajeev; Varlet, Manuel; Breaden, Matthew et al.

In: IEEE Access, Vol. 10, 2022, p. 92357-92372.

Vancouver

Sabharwal SR, Varlet M, Breaden M, Volpe G, Camurri A, Keller PE. huSync - A model and system for the measure of synchronization in small groups: A case study on musical joint action. IEEE Access. 2022;10:92357-92372. doi: 10.1109/ACCESS.2022.3202959

Author

Sabharwal, Sanket Rajeev ; Varlet, Manuel ; Breaden, Matthew et al. / huSync - A model and system for the measure of synchronization in small groups: A case study on musical joint action. In: IEEE Access. 2022 ; Vol. 10. pp. 92357-92372.

Bibtex

@article{ade41c91cbfc4a9c8ab22dd3ce80f119,
title = "huSync - A model and system for the measure of synchronization in small groups: A case study on musical joint action",
abstract = "Human communication entails subtle non-verbal modes of expression, which can be analyzed quantitatively using computational approaches and thus support human sciences. In this paper we present huSync, a computational framework and system that utilizes trajectory information extracted using pose estimation algorithms from video sequences to quantify synchronization between individuals in small groups. The system is exploited to study interpersonal coordination in musical ensembles. Musicians communicate with each other through sounds and gestures, providing nonverbal cues that regulate interpersonal coordination. huSync was applied to recordings of concert performances by a professional instrumental ensemble playing two musical pieces. We examined effects of different aspects of musical structure (texture and phrase position) on interpersonal synchronization, which was quantified by computing phase locking values of head motion for all possible within-group pairs. Results indicate that interpersonal coupling was stronger for polyphonic textures (ambiguous leadership) than homophonic textures (clear melodic leader), and this difference was greater in early portions of phrases than endings (where coordination demands are highest). Results were cross-validated against an analysis of audio features, showing links between phase locking values and event density. This research produced a system, huSync, that can quantify synchronization in small groups and is sensitive to dynamic modulations of interpersonal coupling related to ambiguity in leadership and coordination demands, in standard video recordings of naturalistic human group interaction. huSync enabled a better understanding of the relationship between interpersonal coupling and musical structure, thus enhancing collaborations between human and computer scientists.",
keywords = "Behavioral sciences, Couplings, Entrainment, Interpersonal synchronization, Joint actions, Leadership, Music, Musical ensemble performance, Nonverbal communication, Pose estimation, Social factors, Social interaction, Social signal processing, Synchronization, Video recording",
author = "Sabharwal, {Sanket Rajeev} and Manuel Varlet and Matthew Breaden and Gualtiero Volpe and Antonio Camurri and Keller, {Peter E.}",
year = "2022",
doi = "10.1109/ACCESS.2022.3202959",
language = "English",
volume = "10",
pages = "92357--92372",
journal = "IEEE Access",
issn = "2169-3536",
publisher = "IEEE",
}

RIS

TY - JOUR

T1 - huSync - A model and system for the measure of synchronization in small groups

T2 - A case study on musical joint action

AU - Sabharwal, Sanket Rajeev

AU - Varlet, Manuel

AU - Breaden, Matthew

AU - Volpe, Gualtiero

AU - Camurri, Antonio

AU - Keller, Peter E.

PY - 2022

Y1 - 2022

N2 - Human communication entails subtle non-verbal modes of expression, which can be analyzed quantitatively using computational approaches and thus support human sciences. In this paper we present huSync, a computational framework and system that utilizes trajectory information extracted using pose estimation algorithms from video sequences to quantify synchronization between individuals in small groups. The system is exploited to study interpersonal coordination in musical ensembles. Musicians communicate with each other through sounds and gestures, providing nonverbal cues that regulate interpersonal coordination. huSync was applied to recordings of concert performances by a professional instrumental ensemble playing two musical pieces. We examined effects of different aspects of musical structure (texture and phrase position) on interpersonal synchronization, which was quantified by computing phase locking values of head motion for all possible within-group pairs. Results indicate that interpersonal coupling was stronger for polyphonic textures (ambiguous leadership) than homophonic textures (clear melodic leader), and this difference was greater in early portions of phrases than endings (where coordination demands are highest). Results were cross-validated against an analysis of audio features, showing links between phase locking values and event density. This research produced a system, huSync, that can quantify synchronization in small groups and is sensitive to dynamic modulations of interpersonal coupling related to ambiguity in leadership and coordination demands, in standard video recordings of naturalistic human group interaction. huSync enabled a better understanding of the relationship between interpersonal coupling and musical structure, thus enhancing collaborations between human and computer scientists.

AB - Human communication entails subtle non-verbal modes of expression, which can be analyzed quantitatively using computational approaches and thus support human sciences. In this paper we present huSync, a computational framework and system that utilizes trajectory information extracted using pose estimation algorithms from video sequences to quantify synchronization between individuals in small groups. The system is exploited to study interpersonal coordination in musical ensembles. Musicians communicate with each other through sounds and gestures, providing nonverbal cues that regulate interpersonal coordination. huSync was applied to recordings of concert performances by a professional instrumental ensemble playing two musical pieces. We examined effects of different aspects of musical structure (texture and phrase position) on interpersonal synchronization, which was quantified by computing phase locking values of head motion for all possible within-group pairs. Results indicate that interpersonal coupling was stronger for polyphonic textures (ambiguous leadership) than homophonic textures (clear melodic leader), and this difference was greater in early portions of phrases than endings (where coordination demands are highest). Results were cross-validated against an analysis of audio features, showing links between phase locking values and event density. This research produced a system, huSync, that can quantify synchronization in small groups and is sensitive to dynamic modulations of interpersonal coupling related to ambiguity in leadership and coordination demands, in standard video recordings of naturalistic human group interaction. huSync enabled a better understanding of the relationship between interpersonal coupling and musical structure, thus enhancing collaborations between human and computer scientists.

KW - Behavioral sciences

KW - Couplings

KW - Entrainment

KW - Interpersonal synchronization

KW - Joint actions

KW - Leadership

KW - Music

KW - Musical ensemble performance

KW - Nonverbal communication

KW - Pose estimation

KW - Social factors

KW - Social interaction

KW - Social signal processing

KW - Synchronization

KW - Video recording

UR - http://www.scopus.com/inward/record.url?scp=85137585045&partnerID=8YFLogxK

U2 - 10.1109/ACCESS.2022.3202959

DO - 10.1109/ACCESS.2022.3202959

M3 - Journal article

AN - SCOPUS:85137585045

VL - 10

SP - 92357

EP - 92372

JO - IEEE Access

JF - IEEE Access

SN - 2169-3536

ER -
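The abstract quantifies interpersonal synchronization by computing phase locking values (PLV) of head motion for all within-group pairs. The paper's own pipeline is not reproduced here; as a minimal sketch of the standard Hilbert-phase PLV computation (an assumption about the method, not code from huSync), it could look like:

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV = |mean(exp(i*(phi_x - phi_y)))| between two 1-D motion signals.

    Instantaneous phases come from the analytic (Hilbert) signal.
    1.0 means a perfectly consistent phase relation; values near 0
    mean no consistent phase relation.
    """
    phi_x = np.angle(hilbert(x))
    phi_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

# Two same-frequency signals with a fixed phase offset -> PLV near 1
t = np.linspace(0, 10, 1000)
a = np.sin(2 * np.pi * 1.0 * t)
b = np.sin(2 * np.pi * 1.0 * t + 0.5)
print(phase_locking_value(a, b))
```

In practice the pairwise PLVs would be computed on band-passed head-trajectory signals extracted by pose estimation, then compared across musical-structure conditions (texture, phrase position) as the abstract describes.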