“Quantifying the dynamics of multimodal communication with multimodal data.”
Presented by the Center for Social Statistics
Abstract: Human communication is built upon an array of signals, from body movement to word selection. The sciences of language and communication tend to study these signals individually, yet natural human communication deploys them all together, simultaneously, and within social systems of very different sizes. How this multimodal communication is structured in time and organized across scales remains an open puzzle. It spans two-person interactions as well as much larger systems, such as communication over social media at an unprecedentedly massive scale.
Collaborators and I have explored communication at both of these scales, and I will describe examples from the domain of conflict. At the two-person scale, we have studied conflict communication using video analysis of body and voice dynamics. At the broader scale, we have analyzed large-scale social media behavior (Twitter) during a massively shared experience of conflict, the 2012 Presidential Debates. These projects reveal the importance of dynamics. In two-person conflict, the dynamics of signals such as body and voice during an interaction can reveal the quality of that interaction. Likewise, collective behavior on Twitter can be predicted even by simple linear models using the debate dynamics between Obama and Romney (e.g., one candidate interrupting the other).
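To make the "simple linear model" idea concrete, here is a minimal sketch (not the speakers' actual pipeline or data) of regressing per-minute tweet volume on hand-coded debate-dynamics features; the feature names, time granularity, and numbers are hypothetical placeholders.

```python
# Sketch: ordinary least squares predicting per-minute tweet volume from
# debate-dynamics features. All data below is simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_minutes = 90  # one debate, coded minute by minute (assumed granularity)

# Hypothetical per-minute predictors: interruptions, speaker changes,
# and crosstalk/applause events during the debate, plus an intercept.
X = np.column_stack([
    rng.poisson(1.0, n_minutes),   # interruptions
    rng.poisson(2.0, n_minutes),   # speaker changes
    rng.poisson(0.5, n_minutes),   # crosstalk/applause
    np.ones(n_minutes),            # intercept
])

# Hypothetical response: tweets about the debate posted in each minute.
y = X @ np.array([300.0, 120.0, 200.0, 1000.0]) + rng.normal(0, 150, n_minutes)

# Fit by least squares and report fit quality.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ coef
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(coef, 1))
print("R^2:", round(r2, 3))
```

Even a model this simple captures the point of the talk: moments of interactional dynamics (e.g., interruptions) carry measurable signal about collective audience behavior.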
The collection, quantification, and modeling of such multitemporal and multivariate datasets hold much promise for new kinds of interdisciplinary collaboration. I will end by discussing how they may guide new theoretical directions for understanding the organization and temporal structure of multimodality in communication.
Url: http://statistics.ucla.edu/seminars/2016-03-31/2:30pm/314-royce-hall