Behavior Matching in Multimodal Communication Is Synchronized
Article first published online: 17 SEP 2012
Copyright © 2012 Cognitive Science Society, Inc.
Volume 36, Issue 8, pages 1404–1426, November/December 2012
How to Cite
Louwerse, M. M., Dale, R., Bard, E. G. and Jeuniaux, P. (2012), Behavior Matching in Multimodal Communication Is Synchronized. Cognitive Science, 36: 1404–1426. doi: 10.1111/j.1551-6709.2012.01269.x
- Issue published online: 2 NOV 2012
- Received 13 November 2011; received in revised form 21 February 2012; accepted 29 February 2012
Keywords: Behavior matching; Multimodal communication; Face-to-face conversation
Abstract
A variety of theoretical frameworks predict that two people engaged in communication will come to resemble each other in their behaviors, in the form of coordination, mimicry, or alignment. However, little is known about the time course of this behavior matching, even though there is evidence that dyads synchronize oscillatory motions (e.g., postural sway). This study examined the temporal structure of nonoscillatory actions—language, facial, and gestural behaviors—produced during a route communication task. The focus was the temporal relationship between matching behaviors in the interlocutors (e.g., a facial behavior in one interlocutor vs. the same facial behavior in the other). Cross-recurrence analysis revealed that within each category tested (language, facial, gestural), interlocutors synchronized matching behaviors at temporal lags short enough to permit imitation of one interlocutor by the other from one conversational turn to the next. Both social and cognitive variables predicted the degree of temporal organization. These findings suggest that the temporal structure of matching behaviors provides low-level and low-cost resources for human interaction.
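To make the method concrete, the kind of lagged cross-recurrence computation the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' actual analysis pipeline: it assumes each interlocutor's behavior has been coded as a binary time series (1 = behavior present in that time bin), and the toy data, bin size, and lag range are hypothetical.

```python
def cross_recurrence(a, b, max_lag):
    """Cross-recurrence profile between two binary behavior streams.

    For each lag, returns the fraction of valid time bins in which
    behavior is present in stream `a` at time t AND in stream `b`
    at time t + lag (a positive lag means `b` follows `a`).
    """
    n = len(a)
    profile = {}
    for lag in range(-max_lag, max_lag + 1):
        matches = total = 0
        for t in range(n):
            u = t + lag
            if 0 <= u < n:  # only compare bins that exist in both streams
                total += 1
                if a[t] == 1 and b[u] == 1:
                    matches += 1
        profile[lag] = matches / total if total else 0.0
    return profile

# Toy streams: interlocutor B imitates interlocutor A with a 2-bin delay.
a = [1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0]
b = [0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1]

profile = cross_recurrence(a, b, max_lag=3)
peak_lag = max(profile, key=profile.get)
print(peak_lag)  # → 2: the recurrence peak recovers B's imitation delay
```

A peak in the profile at a nonzero lag, as here, is the signature the study looks for: matching behavior in one interlocutor reliably follows the same behavior in the other at a short, consistent delay.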