The shape of musical sound: Real‐time visualizations of expressiveness in music performance.

2010 ◽  
Vol 127 (3) ◽  
pp. 1983-1983 ◽  
Author(s):  
Gang Ren ◽  
Justin Lundberg ◽  
Mark F. Bocko ◽  
Dave Headlam

2018 ◽  
Vol 24 (3) ◽  
Author(s):  
Rolf Inge Godøy

In recent decades, we have seen a surge in published work on embodied music cognition, and it is now broadly accepted that musical experience is intimately linked with experiences of body motion. It is also clear that music performance is not something abstract and without restrictions, but something traditionally (i.e., before the advent of electronic music) constrained by our possibilities for body motion. The focus of this paper is on these various constraints of sound-producing body motion that shape the emergent perceptual features of musical sound, as well as on how these constraints may enhance our understanding of agency in music perception.


2014 ◽  
Vol 38 (2) ◽  
pp. 51-62 ◽  
Author(s):  
Roger B. Dannenberg ◽  
Nicolas E. Gold ◽  
Dawen Liang ◽  
Guangyu Xia

Computers have the potential to significantly extend the practice of popular music based on steady tempo and mostly determined form. There are significant challenges to overcome, however, including maintaining accurate beat-based timing and adhering to a form or structure that may change, even during the performance itself. We describe an approach to synchronization across media that takes into account latency due to communication delays and audio buffering. We also address the problem of mapping from a conventional score with repeats and other structures to an actual performance, which can involve both “flattening” the score and rearranging it, as is common in popular music. Finally, we illustrate the possibilities of the score as a bidirectional user interface in a real-time system for music performance, allowing the user to direct the computer through a digitally displayed score and allowing the computer to indicate score position back to the human performers.
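To make the score-mapping idea concrete, here is a minimal sketch (not the authors' implementation; the section names, arrangement format, and latency value are illustrative assumptions) of how a conventional score with repeated or reordered sections can be "flattened" into a linear performance order while retaining a map back to the displayed score, together with a trivial latency offset for event scheduling:

```python
# Minimal sketch, not the authors' system: flatten an arrangement of named
# sections (with repeats and reordering, as is common in popular music) into
# a linear performance order, keeping a back-reference to the score so a
# real-time system can report score position to human performers.

def flatten_score(sections, arrangement):
    """Return (performance_step, section_name, score_index) triples."""
    score_index = {name: i for i, name in enumerate(sections)}
    return [(step, name, score_index[name])
            for step, name in enumerate(arrangement)]

def schedule_time(beat_time, output_latency):
    """Issue an event early by the known communication/buffering latency
    so that it sounds at the intended beat time."""
    return beat_time - output_latency

# Hypothetical song structure and rearrangement.
sections = ["Intro", "Verse", "Chorus", "Bridge", "Outro"]
arrangement = ["Intro", "Verse", "Chorus", "Verse", "Chorus",
               "Bridge", "Chorus", "Outro"]

for step, name, idx in flatten_score(sections, arrangement):
    print(f"performance step {step}: '{name}' (score section {idx})")
print("send event at", schedule_time(beat_time=12.0, output_latency=0.05), "s")
```

The same mapping can be read in either direction, which is what lets the score act as a bidirectional interface: a selection in the displayed score picks a performance step, and the current performance step can highlight the corresponding score section.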


2021 ◽  
Vol 3 ◽  
Author(s):  
Florian Henkel ◽  
Gerhard Widmer

The task of real-time alignment between a music performance and the corresponding score (sheet music), also known as score following, poses a challenging multi-modal machine learning problem. Training a system that can solve this task robustly with live audio and real sheet music (i.e., scans or score images) requires precise ground truth alignments between audio and note-coordinate positions in the score sheet images. However, these kinds of annotations are difficult and costly to obtain, which is why research in this area mainly utilizes synthetic audio and sheet images to train and evaluate score following systems. In this work, we propose a method that does not solely rely on note alignments but is additionally capable of leveraging data with annotations of lower granularity, such as bar or score system alignments. This allows us to use a large collection of real-world piano performance recordings coarsely aligned to scanned score sheet images and, as a consequence, improve over current state-of-the-art approaches.
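As an illustration of how lower-granularity annotations can stand in for note-level ground truth, the following toy sketch (an assumption-laden illustration, not the method from the paper) converts a bar-level alignment, i.e. the audio time span and horizontal pixel extent of each bar in a scanned score image, into a weak training target over horizontal score positions:

```python
# Illustrative sketch only: derive a weak training target for score following
# from a coarse bar-level alignment. An audio frame inside a bar is labeled
# with a uniform distribution over that bar's horizontal extent in the sheet
# image, rather than with an exact note coordinate.

import numpy as np

def weak_target(frame_time, bar_times, bar_pixel_spans, image_width):
    """Return a probability vector over horizontal pixel positions.

    bar_times: list of (start_sec, end_sec) per bar.
    bar_pixel_spans: list of (x_start, x_end) per bar in the score image.
    """
    target = np.zeros(image_width)
    for (t0, t1), (x0, x1) in zip(bar_times, bar_pixel_spans):
        if t0 <= frame_time < t1:
            target[x0:x1] = 1.0 / (x1 - x0)  # uniform over the bar's extent
            break
    return target

# Toy example: three bars, each two seconds long and 100 pixels wide.
bar_times = [(0.0, 2.0), (2.0, 4.0), (4.0, 6.0)]
bar_pixel_spans = [(0, 100), (100, 200), (200, 300)]
t = weak_target(3.1, bar_times, bar_pixel_spans, image_width=300)
print(np.flatnonzero(t)[[0, -1]])  # -> [100 199], the second bar's extent
```

A frame inside a bar thus contributes a coarse target over the bar's extent instead of an exact note coordinate, which is the sense in which coarsely aligned real-world recordings can still supervise a score-following model.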


2021 ◽  
Author(s):  
Thibault Chabin ◽  
Damien Gabriel ◽  
Alexandre Comte ◽  
Emmanuel Haffen ◽  
Thierry Moulin ◽  
...  

Over the years, several publications have proposed that musical sound could be an ancestral emotional means of communication, thus positing an ancestral biological function for music. Understanding how musical emotions, and the pleasure derived from music regardless of its valence, can be shared between individuals is a fascinating question, and investigating it can shed light on the function of musical reward. Is the pleasure felt at the individual level transmitted at a collective level? And if so, how? We investigated these questions in a natural setting during an international competition for orchestra conductors. Participants (n=15) used a dedicated smartphone app to report their subjective emotional experiences in real time during a concert. We recorded participants’ electrodermal activity (EDA) and cerebral activity with electroencephalography (EEG). The overall behavioral real-time ratings suggest a possible social influence on the reported and felt pleasure: the physically closer the participants, the more similar their reported pleasure. We estimated inter-individual cerebral coherence, which indicates the degree of mutual cerebral information between pairs of participants in the frequency domain. The results show that when people simultaneously reported either high or low pleasure, their cerebral activities were closer than when they simultaneously reported neutral pleasure. Participants’ skin conductance levels were also more coupled when they simultaneously reported more intense emotions. More importantly, participants who were physically closer had higher cerebral coherence, but only when they simultaneously reported intense pleasure. We propose that mechanisms of emotional contagion and/or emotional resonance could explain why a form of ‘emotional connecting force’ could arise between people.
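For readers unfamiliar with inter-individual coherence, the following sketch (synthetic data and an assumed sampling rate; not the authors' analysis pipeline) computes magnitude-squared coherence between two signals standing in for one EEG channel from each of two participants, using scipy:

```python
# Minimal sketch with synthetic data, not the study's pipeline: magnitude-
# squared coherence between two "participants'" signals as a simple
# frequency-domain measure of inter-individual coupling.
import numpy as np
from scipy.signal import coherence

fs = 250                      # assumed EEG sampling rate in Hz
t = np.arange(0, 60, 1 / fs)  # one minute of data
rng = np.random.default_rng(0)

# Two synthetic signals sharing a 10 Hz (alpha-band) component plus
# independent noise, standing in for one EEG channel per participant.
shared = np.sin(2 * np.pi * 10 * t)
eeg_a = shared + rng.normal(scale=1.0, size=t.size)
eeg_b = shared + rng.normal(scale=1.0, size=t.size)

f, cxy = coherence(eeg_a, eeg_b, fs=fs, nperseg=fs * 2)
alpha = (f >= 8) & (f <= 12)
print(f"mean alpha-band coherence: {cxy[alpha].mean():.2f}")
```

Higher coherence in a given frequency band indicates that the two signals share more linearly related activity at those frequencies, which is the sense in which two participants' cerebral activities can be said to be "closer."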

