How does synchronization facilitate social interactions?

Taking turns during conversations may help coordinate verbal and nonverbal cues.


Turn-taking dynamics of social interactions are important for speech and gesture synchronization, enabling conversations to proceed efficiently, according to a study published September 25, 2024, in the open-access journal PLOS ONE by Tifenn Fauviaux from the University of Montpellier, France, and colleagues.

Conversations encompass continuous exchanges of verbal and nonverbal information. Previous research has demonstrated that gestures and speech synchronize at the individual level. But few studies have investigated how this phenomenon may unfold between individuals.

To fill this knowledge gap, Fauviaux and colleagues used an online dataset consisting of 14 sessions of two people engaged in unstructured face-to-face conversations during which they were free to talk about specific topics. Each of these sessions contained between one and four discussions, and the conversations lasted from 7 to 15 minutes.

The researchers analyzed both audio and motion data and measured speech and gesture synchronization at different timescales. Specifically, they captured vocal properties through the speech amplitude envelope and movement properties through head and wrist gestures.
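For readers curious how such measures can be computed in practice, the sketch below illustrates one generic approach: extracting a speech amplitude envelope with a Hilbert transform, converting wrist positions into speeds, and finding the peak lagged correlation between the two signals. This is only an illustration under assumed inputs (hypothetical `audio` and `wrist_pos` arrays and sampling rates), not the authors' actual analysis pipeline.

```python
# Hedged sketch: speech-gesture synchronization from an audio amplitude
# envelope and wrist motion. Not the study's actual pipeline; inputs
# `audio` (mono waveform) and `wrist_pos` (T x 3 positions) are assumed.
import numpy as np
from scipy.signal import hilbert, resample

def amplitude_envelope(audio, sr_audio, sr_target=100):
    """Speech amplitude envelope via the Hilbert transform, downsampled."""
    env = np.abs(hilbert(audio))
    n_out = int(len(audio) * sr_target / sr_audio)
    return resample(env, n_out)

def wrist_speed(wrist_pos, sr_motion, sr_target=100):
    """Frame-to-frame wrist speed, resampled to a common rate."""
    speed = np.linalg.norm(np.diff(wrist_pos, axis=0), axis=1) * sr_motion
    n_out = int(len(speed) * sr_target / sr_motion)
    return resample(speed, n_out)

def max_lagged_correlation(x, y, sr=100, max_lag_s=2.0):
    """Peak Pearson correlation between two signals within +/- max_lag_s."""
    n = min(len(x), len(y))
    x = (x[:n] - x[:n].mean()) / x[:n].std()
    y = (y[:n] - y[:n].mean()) / y[:n].std()
    max_lag = int(max_lag_s * sr)
    lags = list(range(-max_lag, max_lag + 1))
    corrs = [np.corrcoef(x[max(0, l):n + min(0, l)],
                         y[max(0, -l):n + min(0, -l)])[0, 1]
             for l in lags]
    best = int(np.argmax(corrs))
    return corrs[best], lags[best] / sr  # correlation and lag in seconds
```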

Schematic diagram of the relationship between intrapersonal and interpersonal synchronization, whether unimodal or multimodal. A) and B) represent intrapersonal synchronization among the modalities of a single speaker. A) Full blue arrows highlight the unimodal relationship between the gestures generated by a single individual (i.e., head vs. head, head vs. wrist). B) Dashed blue arrows highlight the multimodal relationship between the voice and the gesture produced by a single individual (i.e., head vs. voice; wrist vs. voice). C) and D) represent the interpersonal synchronization between the modalities of speaker A and speaker B. C) Full red arrows highlight the unimodal relationships between the movements of speaker A and speaker B, and between their voices (i.e., head vs. head, head vs. wrist, voice vs. voice). D) Dashed red arrows highlight the multimodal relationships between the voice of speaker A and the gesture of speaker B and inversely (i.e., head vs. voice; wrist vs. voice). Image Credit: Fauviaux et al., 2024, PLOS ONE

The results supported previous research on speech and gesture coordination at the individual level, revealing synchronization at all timescales of the conversation. That is, there was higher-than-chance synchronization between a given participant’s wrist and head movements, and similar synchronization between these movements and vocal properties.
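"Higher than chance" in this kind of work is typically established against surrogate data, for instance by time-shifting or re-pairing one signal and recomputing the same synchronization measure. The snippet below sketches that general idea with a simple circular-shift null distribution; the variable names (`envelope`, `speed`) are hypothetical, and this is not necessarily the specific statistical procedure used in the study.

```python
# Hedged sketch of a chance-level baseline via surrogate data
# (a common approach; not necessarily the authors' exact procedure).
import numpy as np

def pearson(x, y):
    """Zero-lag Pearson correlation between two equal-length signals."""
    return np.corrcoef(x, y)[0, 1]

def surrogate_null(x, y, n_surrogates=1000, seed=0):
    """Null distribution: correlations after circularly shifting one signal."""
    rng = np.random.default_rng(seed)
    return np.array([pearson(x, np.roll(y, int(rng.integers(1, len(y)))))
                     for _ in range(n_surrogates)])

# Synchronization counts as above chance if the observed correlation exceeds,
# for example, the 95th percentile of the surrogate distribution:
#   observed = pearson(envelope, speed)
#   above_chance = observed > np.percentile(surrogate_null(envelope, speed), 95)
```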

Extending the literature, the researchers also found that gestures and speech synchronize between individuals. In other words, there was coordination between the voices and the bodies of the two speakers. Taken together, the findings suggest that this type of synchronization of verbal and nonverbal information likely depends on the turn-taking dynamics of conversations.

According to the authors, the study enriches our understanding of behavioral dynamics during social interactions at both the intrapersonal and interpersonal levels, and strengthens knowledge regarding the importance of synchrony between speech and gestures. Future research building on this study could shed light on prosocial behaviors and psychiatric conditions characterized by social deficits.

The authors add: “How do my speech and behaviors influence, or respond to, the speech and behaviors of the person I’m conversing with? This study answers this question by investigating the multimodal dynamic between speech and movements, both at the individual’s level and the dyadic level. Our findings confirm intrapersonal coordination between speech and gestures across all temporal scales. It also suggests that multimodal and interpersonal synchronization may be influenced by the speech channel, particularly the dynamics of turn-taking.”

Journal Reference:

  1. Fauviaux T, Marin L, Parisi M, Schmidt R, Mostafaoui G (2024) From unimodal to multimodal dynamics of verbal and nonverbal cues during unstructured conversation. PLoS ONE 19(9): e0309831. DOI: 10.1371/journal.pone.0309831