Multiscale Performer–Audience Physical Synchrony in Joint Music Performance

Previous studies have shown that physical coordination among musicians characterizes their joint performances [1,2]. This report presents an analysis of physical synchrony between performers and audiences, testing the hypothesis that audiences' subtle physical activities are linked to those of the performers and reflect how they experience different modes of performance. In the experiment, the Portorius String Quartet performed movements from Mozart and Haydn quartets, as well as improvised pieces, in different performance modes. For the repertoire works, the quartet performed each piece twice, in two modes: (a) a “strict” mode (strictly following every textual instruction while aiming for the best and most expressive performance) and (b) a “let-go” mode (taking a creative, improvisatory approach that allows more risk-taking and spontaneous expressive gestures) [2]. The performance was attended by forty-two audience members. We measured the performers’ head motions with inertial measurement units (IMUs) attached to their foreheads, and the audience members’ body fluctuations with IMUs in smartphones worn around their necks [3]. We evaluated the physical synchrony of each musician-audience pair in time-frequency space via the wavelet transform coherence (WTC) of their acceleration-norm time series. By averaging the coherence values, we obtained a summary measure of how strongly each audience member was, on average, in sync with the musicians at each time scale (1/frequency) for each piece. A repeated-measures ANOVA comparing the let-go and strict modes at each time scale revealed that performer-audience synchrony was higher for the strict mode at shorter time scales (around 1 second; musically, a more strictly metronomic and often less expressive mode), whereas at longer time scales on the order of ten seconds the let-go mode dominated (musically, a mode allowing more spontaneous expressive gestures). These results suggest that collective music experience is embodied in a multiscale, adaptive dynamical interaction between performers and audiences, and that this process is sensitive to the degree of creativity involved in the music making. They also support the utility of physical sensing technologies for capturing subtle but important aspects of collective music experience, and potentially for improving it through feedback.
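To make the analysis pipeline concrete, the sketch below reimplements its core in Python: Morlet-wavelet transform coherence between two acceleration-norm time series, followed by the time-averaged coherence per time scale that serves as the summary synchrony measure. This is an illustrative reconstruction under stated assumptions, not the authors' code; the sampling rate, scale range, and smoothing choices (Gaussian in time, boxcar across scales, following the standard WTC formulation of Torrence & Webster and Grinsted et al.) are assumptions.

```python
# Minimal sketch (assumptions, not the authors' code): wavelet transform coherence
# (WTC) between two acceleration-norm time series, averaged over time to give one
# summary coherence value per time scale.
import numpy as np
from scipy.ndimage import uniform_filter1d


def morlet_cwt(x, dt, scales, omega0=6.0):
    """Continuous wavelet transform with an analytic Morlet wavelet, via FFT.
    With omega0 = 6 the Fourier period is ~1.03 * scale, so scales can be read
    as time scales in seconds."""
    n = len(x)
    x = x - x.mean()
    omega = 2.0 * np.pi * np.fft.fftfreq(n, d=dt)          # angular frequencies
    xf = np.fft.fft(x)
    W = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        psi_hat = np.pi ** (-0.25) * np.exp(-0.5 * (s * omega - omega0) ** 2)
        psi_hat *= np.sqrt(2.0 * np.pi * s / dt) * (omega > 0)
        W[i] = np.fft.ifft(xf * np.conj(psi_hat))
    return W


def smooth(F, dt, scales, dj):
    """Smoothing operator: Gaussian in time (std = scale) and a boxcar of width
    0.6 in log2-scale units across scales, as in standard WTC implementations."""
    n = F.shape[1]
    omega = 2.0 * np.pi * np.fft.fftfreq(n, d=dt)
    Ff = np.fft.fft(F, axis=1)
    T = np.empty(F.shape, dtype=complex)
    for i, s in enumerate(scales):
        T[i] = np.fft.ifft(Ff[i] * np.exp(-0.5 * (s * omega) ** 2))
    width = max(1, int(round(0.6 / dj)))
    return (uniform_filter1d(T.real, width, axis=0)
            + 1j * uniform_filter1d(T.imag, width, axis=0))


def wtc(x, y, dt, dj=1.0 / 12, s0=None, num_octaves=10):
    """Wavelet transform coherence of two equally sampled signals.
    Returns the scales and a coherence map of shape (n_scales, n_samples)."""
    if s0 is None:
        s0 = 2.0 * dt                                      # smallest resolvable scale
    scales = s0 * 2.0 ** (dj * np.arange(int(num_octaves / dj) + 1))
    Wx = morlet_cwt(x, dt, scales)
    Wy = morlet_cwt(y, dt, scales)
    inv_s = (1.0 / scales)[:, None]
    Sxx = smooth(inv_s * np.abs(Wx) ** 2, dt, scales, dj).real
    Syy = smooth(inv_s * np.abs(Wy) ** 2, dt, scales, dj).real
    Sxy = smooth(inv_s * Wx * np.conj(Wy), dt, scales, dj)
    return scales, np.abs(Sxy) ** 2 / (Sxx * Syy)


if __name__ == "__main__":
    # Synthetic stand-in for one performer-audience pair: 2 minutes of acceleration
    # norms at an assumed 50 Hz IMU rate, sharing a weak ~10 s expressive swell.
    fs = 50.0
    dt = 1.0 / fs
    t = np.arange(0.0, 120.0, dt)
    rng = np.random.default_rng(0)
    performer = np.abs(rng.standard_normal(t.size)) + 0.3 * np.sin(2 * np.pi * t / 10)
    audience = np.abs(rng.standard_normal(t.size)) + 0.3 * np.sin(2 * np.pi * t / 10 + 0.5)

    scales, coh = wtc(performer, audience, dt)
    mean_coh = coh.mean(axis=1)                            # summary coherence per scale
    for s, c in zip(scales[::12], mean_coh[::12]):         # roughly one value per octave
        print(f"time scale ~{s:6.2f} s : mean coherence {c:.2f}")
```

In the full analysis, one would repeat this for every musician-audience pair and piece, mask edge effects near the cone of influence, and feed the per-scale mean coherences into the repeated-measures ANOVA comparing the strict and let-go modes (for example with statsmodels' AnovaRM); those steps are omitted here for brevity.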
Acknowledgements
We thank the Guildhall School of Music and Drama for allowing us to use their facilities and for the support of their IT team. This study was supported by JST-COI Grant Number JPMJCE1309 and by KAKENHI Grant Number JP20H03553 from JSPS/MEXT, Japan.
References
[1] A. Chang, S.R. Livingstone, D.J. Bosnyak, L.J. Trainor, Proc Natl Acad Sci USA, 114 (2017) E4134-E4141.
[2] D. Dolan, H.J. Jensen, P.A.M. Mediano, M. Molina-Solana, H. Rajpal, F. Rosas, J.A. Sloboda, Front Psychol, 9 (2018) 1341.
[3] T. Nozawa, M. Uchiyama, K. Honda, T. Nakano, Y. Miyake, Sensors, 20 (2020) 2948.

Session:
Authors: Takayuki Nozawa, David Dolan, Fernando Rosas, Hardik Rajpal, Christopher Timmermann, Pedro Mediano, Sophia Prodanova, Nicole Petrus Barracks, Nathan Giorgetti, Diogo Ramos, Keigo Honda, Shunnichi Amano, Yoshihiro Miyake and Henrik Jensen
Room: 5
Date: Friday, December 11, 2020 - 13:35 to 13:50
