Does Movement Amplitude of a Co-performer Affect Individual Performance in Musical Synchronization?

2021, Vol 4, pp. 205920432110317
Author(s): Ian D. Colley, Manuel Varlet, Jennifer MacRitchie, Peter E. Keller

Interpersonal coordination in musical ensembles often involves multisensory cues, with visual information about body movements supplementing co-performers’ sounds. Previous research on the influence of movement amplitude of a visual stimulus on basic sensorimotor synchronization has shown mixed results. Uninstructed visuomotor synchronization seems to be influenced by the amplitude of a visual stimulus, but instructed visuomotor synchronization is not. While music performance presents a special case of visually mediated coordination, involving both uninstructed (spontaneously coordinating ancillary body movements with co-performers) and instructed (producing sound on a beat) forms of synchronization, the underlying mechanisms might also support rhythmic interpersonal coordination in the general population. We asked whether visual cue amplitude would affect nonmusicians’ synchronization of sound and head movements in a musical drumming task designed to be accessible regardless of musical experience. Given the mixed prior results, we considered two competing hypotheses. H1: higher amplitude visual cues will improve synchronization. H2: different amplitude visual cues will have no effect on synchronization. Participants observed a human-derived motion capture avatar with three levels of movement amplitude, or a still image of the avatar, while drumming along to the beat of tempo-changing music. The moving avatars were always timed to match the music. We measured temporal asynchrony (drumming relative to the music), predictive timing, ancillary movement fluctuation, and cross-spectral coherence of ancillary movements between the participant and avatar. The competing hypotheses were tested using conditional equivalence testing. This method involves using a statistical equivalence test in the event that standard hypothesis tests show no differences. Our results showed no statistical differences across visual cue types. Therefore, we conclude that there is not a strong effect of visual stimulus amplitude on instructed synchronization.
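
The conditional equivalence testing procedure can be sketched in a few lines: run a standard paired test first and, only if it shows no difference, follow up with two one-sided tests (TOST) against a smallest effect of interest. The equivalence bound, sample size, and synthetic asynchrony values below are illustrative assumptions, not the authors' analysis code.

```python
import numpy as np
from scipy import stats

def conditional_equivalence_test(x, y, bound, alpha=0.05):
    """Paired t-test first; if non-significant, run a two one-sided tests (TOST)
    equivalence procedure on the paired differences against +/- `bound`."""
    d = np.asarray(x) - np.asarray(y)
    t, p = stats.ttest_rel(x, y)                  # standard paired t-test
    if p < alpha:
        return {"difference": True, "p": p}
    n = d.size
    se = d.std(ddof=1) / np.sqrt(n)
    t_low = (d.mean() + bound) / se               # H0: mean <= -bound
    t_high = (d.mean() - bound) / se              # H0: mean >= +bound
    p_low = 1 - stats.t.cdf(t_low, df=n - 1)
    p_high = stats.t.cdf(t_high, df=n - 1)
    p_tost = max(p_low, p_high)
    return {"difference": False, "equivalent": p_tost < alpha, "p_tost": p_tost}

# Illustrative use: mean absolute asynchronies (ms) under two cue conditions.
rng = np.random.default_rng(1)
still_image = rng.normal(30, 8, 24)
large_amplitude = still_image + rng.normal(0, 4, 24)
print(conditional_equivalence_test(still_image, large_amplitude, bound=10.0))
```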

2017, Vol 114 (21), pp. E4134-E4141
Author(s): Andrew Chang, Steven R. Livingstone, Dan J. Bosnyak, Laurel J. Trainor

The cultural and technological achievements of the human species depend on complex social interactions. Nonverbal interpersonal coordination, or joint action, is a crucial element of social interaction, but the dynamics of nonverbal information flow among people are not well understood. We used joint music making in string quartets, a complex, naturalistic nonverbal behavior, as a model system. Using motion capture, we recorded body sway simultaneously in four musicians, which reflected real-time interpersonal information sharing. We used Granger causality to analyze predictive relationships among the motion time series of the players and determine the magnitude and direction of information flow between them. We experimentally manipulated which musician was the leader (followers were not informed who was leading) and whether they could see each other, to investigate how these variables affect information flow. We found that assigned leaders exerted significantly greater influence on others and were less influenced by others compared with followers. This effect was present whether or not they could see each other, but was enhanced with visual information, indicating that visual as well as auditory information is used in musical coordination. Importantly, performers’ ratings of the “goodness” of their performances were positively correlated with the overall degree of body sway coupling, indicating that communication through body sway reflects perceived performance success. These results confirm that information sharing in a nonverbal joint action task occurs through both auditory and visual cues and that the dynamics of information flow are affected by changing group relationships.
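
As a rough illustration of the kind of directed analysis described above, the sketch below runs pairwise Granger causality on two synthetic body-sway series with statsmodels; the signals, differencing step, and maximum lag are assumptions chosen for demonstration, not the study's pipeline.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# Synthetic anterior-posterior sway for a "leader" and a "follower":
# the follower partially tracks the leader with a short lag.
rng = np.random.default_rng(0)
n = 2000
leader = np.cumsum(rng.normal(0, 1, n))
follower = 0.6 * np.roll(leader, 5) + 0.4 * np.cumsum(rng.normal(0, 1, n))
leader, follower = np.diff(leader), np.diff(follower)   # difference to keep the series stationary

# Null hypothesis: the series in the second column does NOT Granger-cause the first.
data = np.column_stack([follower, leader])              # test leader -> follower
res = grangercausalitytests(data, maxlag=10, verbose=False)
p_values = {lag: out[0]["ssr_ftest"][1] for lag, out in res.items()}
print(min(p_values.values()))                            # smallest p-value across lags
```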


2018
Author(s): Huriye Atilgan, Jennifer K. Bizley

The ability to use temporal relationships between cross-modal cues facilitates perception and behavior. Previously we observed that temporally correlated changes in the size of a visual stimulus and the intensity of an auditory stimulus influenced the ability of listeners to perform an auditory selective attention task (Maddox et al., 2015). In this task, participants detected timbral changes in a target sound while ignoring those in a simultaneously presented masker. When the visual stimulus was temporally coherent with the target sound, performance was significantly better than when it was temporally coherent with the masker sound, despite the visual stimulus conveying no task-relevant information. Here, we trained observers to detect audiovisual temporal coherence and asked whether this improved their ability to benefit from visual cues during the auditory selective attention task. We observed that these listeners improved performance in the auditory selective attention task and changed the way in which they benefited from a visual stimulus: after training, performance was better when the visual stimulus was temporally coherent with either the target or the masker stream, relative to the condition in which the visual stimulus was coherent with neither auditory stream. A second group, which trained to discriminate modulation-rate differences between temporally coherent audiovisual streams, improved task performance but did not change the way in which they used visual information. A control group did not change their performance between pre-test and post-test. These results provide insights into how cross-modal experience may optimize multisensory integration.
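
To make the stimulus construction concrete, the sketch below builds two independent low-pass-filtered amplitude envelopes, uses them to modulate a target and a masker tone, and lets a visual disc's radius track one envelope or the other; the modulation cutoff, carrier frequencies, and frame rate are illustrative assumptions rather than the published stimulus parameters.

```python
import numpy as np

def random_envelope(duration_s, fs, rng, cutoff_hz=7.0):
    """Slowly fluctuating random amplitude envelope in [0, 1], made by
    low-pass filtering white noise in the frequency domain."""
    n = int(duration_s * fs)
    spectrum = np.fft.rfft(rng.normal(size=n))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectrum[freqs > cutoff_hz] = 0.0            # keep only slow fluctuations
    env = np.fft.irfft(spectrum, n)
    return (env - env.min()) / (env.max() - env.min())

fs = 44_100
rng = np.random.default_rng(2)
target_env = random_envelope(14.0, fs, rng)      # drives the target sound's intensity
masker_env = random_envelope(14.0, fs, rng)      # drives the masker sound's intensity

t = np.arange(len(target_env)) / fs
target_sound = target_env * np.sin(2 * np.pi * 440.0 * t)   # amplitude-modulated tones
masker_sound = masker_env * np.sin(2 * np.pi * 550.0 * t)

# Visual stream: a disc whose radius follows one of the envelopes at 60 fps.
frame_samples = np.arange(0, len(target_env), fs // 60)
radius_coherent_with_target = 0.5 + 0.5 * target_env[frame_samples]
radius_coherent_with_masker = 0.5 + 0.5 * masker_env[frame_samples]
```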


2011, Vol 15 (3), pp. 324-342
Author(s): Klaus-Ernst Behne, Clemens Wöllner

The visual impact of musicians’ body movements has increasingly attracted research interest over the past twenty years. This article gives an overview of the main findings of this research and introduces and replicates one of the first experiments on visual information in music performance evaluations. In Behne’s study (originally published in German in 1990), a pianist was video-recorded performing compositions by Brahms and Chopin. In an audiovisual manipulation paradigm, further pianists acted as doubles and pretended to perform the music to the soundtrack of the first pianist. Different groups of ninety-three musicians and non-musicians rated audiovisual presentations of the videos. Only one participant in the whole series of experiments supposed that the musical soundtrack was similar across different performers. Even musically trained participants strongly believed that they perceived differences between performances. Further findings suggest gender effects, such that male interpretations were perceived to be more precise and female interpretations to be more dramatic. The replication generally confirmed the results for a present-day audience. Potential consequences for music evaluations and theories of audiovisual music perception are discussed.


2019, Vol 72 (9), pp. 2272-2287
Author(s): Caroline Palmer, Frances Spidle, Erik Koopmans, Peter Schubert

We examined the relationship between endogenous rhythms, auditory and visual cues, and body movement in the temporal coordination of duet singers. Sixteen pairs of experienced vocalists sang a familiar melody in Solo and two Duet conditions. Vocalists sang together in Unison (simultaneously producing identical pitches) and Round Duet conditions (one vocalist, the Follower, producing pitches at an eight-tone delay from their partner, the Leader) while facing Inward (full visual cues) and Outward (reduced visual cues). Larger tempo differences in partners’ spontaneous (temporally unconstrained) Solo performances were associated with larger asynchrony in Duet performances, consistent with coupling predictions for oscillators with similar natural frequencies. Vocalists were slightly but consistently more synchronous in Duets when facing their partner (Inward) than when facing Outward; Unison and Round performances were equally synchronous. The greater difficulty of Round production was evidenced by vocalists’ slower performance rates and more variable head movements; Followers directed their head gaze away from their partner and used bobbing head movements to mark the musical beat. The strength of Followers’ head movements corresponded to the amount of tone onset asynchrony with their partners, indicating a strong association between timing and movement under increased attentional and working memory demands in music performance.
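
The coupled-oscillator prediction referenced above (partners with more similar spontaneous tempi should lock with smaller phase lags) can be illustrated with two mutually coupled Kuramoto phase oscillators; the coupling strength and tempi below are arbitrary values chosen for the sketch, not fitted to the duet data.

```python
import numpy as np

def duet_asynchrony(tempo_a_bpm, tempo_b_bpm, coupling=2.0, t_end=60.0, dt=0.01):
    """Two mutually coupled phase oscillators (Kuramoto); returns the mean
    steady-state phase difference converted to milliseconds of asynchrony."""
    wa = 2 * np.pi * tempo_a_bpm / 60.0          # natural frequencies (rad/s)
    wb = 2 * np.pi * tempo_b_bpm / 60.0
    pa, pb = 0.0, 0.0
    diffs = []
    for step in range(int(t_end / dt)):
        pa += dt * (wa + coupling * np.sin(pb - pa))
        pb += dt * (wb + coupling * np.sin(pa - pb))
        if step * dt > t_end / 2:                # discard the transient
            diffs.append(np.angle(np.exp(1j * (pa - pb))))
    mean_period_s = 2 * 60.0 / (tempo_a_bpm + tempo_b_bpm)
    return np.mean(diffs) / (2 * np.pi) * mean_period_s * 1000.0

for detuning in (0, 4, 8, 12):                   # tempo difference in beats per minute
    print(detuning, round(duet_asynchrony(104, 104 + detuning), 1), "ms")
```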


2020, Vol 117 (37), pp. 23085-23095
Author(s): Benjamin Cellini, Jean-Michel Mongeau

Animals use active sensing to respond to sensory inputs and guide future motor decisions. In flight, flies generate a pattern of head and body movements to stabilize gaze. How the brain relays visual information to control head and body movements, and how active head movements influence downstream motor control, remain elusive. Using a control theoretic framework, we studied the optomotor gaze stabilization reflex in tethered flight and quantified how head movements stabilize visual motion and shape wing steering efforts in fruit flies (Drosophila). By shaping visual inputs, head movements increased the gain of wing steering responses and the coordination between stimulus and wings, pointing to a tight coupling between head and wing movements. Head movements followed the visual stimulus in as little as 10 ms (a delay similar to that of the human vestibulo-ocular reflex), whereas wing steering responses lagged by more than 40 ms. This timing difference suggests a temporal order in the flow of visual information, such that the head filters the visual information eliciting downstream wing steering responses. Head fixation significantly decreased the mechanical power generated by the flight motor by reducing wingbeat frequency and overall thrust. By simulating an elementary motion detector array, we show that head movements shift the effective visual input dynamic range onto the sensitivity optimum of the motion vision pathway. Taken together, our results reveal a transformative influence of active vision on flight motor responses in flies. Our work provides a framework for understanding how to coordinate moving sensors on a moving body.
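
Elementary motion detector arrays of the kind simulated here are commonly modelled as Hassenstein-Reichardt correlators: each unit multiplies a delayed photoreceptor signal with the undelayed signal of its neighbour and subtracts the mirror-symmetric product. The sketch below, with arbitrary delay and grating parameters, shows the principle rather than reproducing the authors' model.

```python
import numpy as np

def reichardt_output(luminance, delay_samples=5):
    """Array of Hassenstein-Reichardt correlators over photoreceptor signals
    of shape (time, space). Positive mean output indicates net motion toward
    increasing spatial index (for small delays and spatial frequencies)."""
    delayed = np.zeros_like(luminance)
    delayed[delay_samples:] = luminance[:-delay_samples]   # delay arm (stand-in for a low-pass filter)
    left, right = luminance[:, :-1], luminance[:, 1:]
    d_left, d_right = delayed[:, :-1], delayed[:, 1:]
    # Correlate each receptor's delayed signal with its neighbour, in both mirror arms.
    return (d_left * right - d_right * left).mean(axis=1)

# Sinusoidal grating drifting across a row of photoreceptors, in both directions.
fs, n_receptors = 1000, 32
t = np.arange(0, 1.0, 1 / fs)[:, None]           # time (s), column vector
x = np.arange(n_receptors)[None, :]               # receptor index, row vector
for direction in (+1, -1):
    grating = np.sin(2 * np.pi * (0.1 * x - direction * 4.0 * t))
    print(direction, round(reichardt_output(grating).mean(), 3))
```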


2020
Author(s): Karl K. Kopiske, Daniel Koska, Thomas Baumann, Christian Maiwald, Wolfgang Einhäuser

Most humans can walk effortlessly across uniform terrain, even without paying much attention to it. However, most natural terrain is far from uniform, and we need visual information to maintain stable gait. In a controlled yet naturalistic environment, we simulated terrain difficulty through slip-like perturbations that were either unpredictable (experiment 1) or sometimes preceded by visual cues (experiment 2), while recording eye and body movements using mobile eye tracking and full-body motion tracking. We quantified the distinct roles of eye and head movements in adjusting gaze on different time scales. While motor perturbations mainly influenced head movements, eye movements were primarily affected by visual cues, both immediately following slips and, to a lesser extent, over 5-minute blocks. Gaze parameters were already adapted after the first perturbation in each block, with little transfer between blocks. In conclusion, gaze-gait interactions in experimentally perturbed naturalistic walking are adaptive, flexible, and effector-specific.
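
One standard way to quantify gaze responses immediately following slips is a perturbation-triggered average: cut the gaze trace into windows around each slip onset and average across slips. The sampling rate, window length, and synthetic gaze signal below are assumptions for illustration, not the study's processing pipeline.

```python
import numpy as np

def perturbation_triggered_average(signal, onsets, fs, pre_s=0.5, post_s=2.0):
    """Average a 1-D signal in windows of [-pre_s, +post_s] around each onset (in samples)."""
    pre, post = int(pre_s * fs), int(post_s * fs)
    windows = [signal[i - pre:i + post] for i in onsets
               if i - pre >= 0 and i + post <= len(signal)]
    return np.arange(-pre, post) / fs, np.mean(windows, axis=0)

# Illustrative data: vertical gaze angle (deg) sampled at 120 Hz, with slip onsets.
fs = 120
rng = np.random.default_rng(3)
gaze = np.cumsum(rng.normal(0, 0.05, fs * 300))          # slow drifting baseline
slip_onsets = rng.integers(fs, len(gaze) - 3 * fs, 40)
for i in slip_onsets:                                    # gaze dips toward the ground after each slip
    gaze[i:i + fs] -= np.hanning(fs) * 5.0

time, avg = perturbation_triggered_average(gaze, slip_onsets, fs)
print(round(time[np.argmin(avg)], 2), "s after slip:", round(avg.min(), 2), "deg")
```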


Behaviour, 1979, Vol 70 (1-2), pp. 1-116
Author(s): I. Bossema

The European jay (Garrulus g. glandarius) strongly depends on acorns for food. Many acorns are hoarded, enabling the jay to feed on them at times of the year when they would otherwise be unavailable. Many of the hoarded acorns germinate and become seedlings, so that jays play an important role in the dispersal of acorns and the reproduction of oaks (in this study: Quercus robur, the pedunculate oak). These mutual relationships were analysed both with wild jays in the field (province of Drente, The Netherlands) and with tame birds in confinement. Variation in the composition of the food throughout the year is described quantitatively. Acorns were the stock diet of adults in most months of the year. Leaf-eating caterpillars, predominantly occurring on oak, were the main food items of nestlings. Acorns formed the bulk of the food of fledglings in June. A high rate of acorn consumption in winter, spring and early summer becomes possible because individual jays hoard several thousand acorns, mainly in October. In experiments, acorns of pedunculate oak were not preferred over equal-sized acorns of sessile oak (which was not found in the study area). Acorns of pedunculate oak were strongly preferred over those of American oak and nuts of hazel and beech. Among acorns of pedunculate oak, ripe, sound, long-slim and big ones were preferred. Jays collect one or more (up to six) acorns per hoarding trip. In the latter case, the first ones are swallowed and the last one is usually carried in the bill. For swallowing, the dimensions of the beak imposed a limit on size preference; for bill transport, usually the biggest acorn was selected. The greater the number of acorns per trip, the longer the transportation distance during hoarding. From trip to trip jays dispersed their acorns widely, and when several acorns were transported during one trip, these were generally buried at different sites. Burial took place by pushing acorns into the soil and by subsequent hammering and covering. Jays often selected rather open sites, transitions in the vegetation and vertical structures such as saplings and tree trunks for the burial of acorns. In captivity jays also hoarded surplus food. Here, spacing out of burials was also observed, with previously used sites usually being avoided. In addition, hiding along substrate edges and near conspicuous objects was observed. Jays tended to hide near sticks presented in a horizontal position rather than near identical ones in a vertical position, especially when the colour of the sticks contrasted with the colour of the substrate. Also, rough-surfaced substrate was strongly preferred over similar but smooth-surfaced substrate. Successful retrieval of and feeding on hoarded acorns were observed in winter even when snow cover had considerably altered the scenery. No evidence was obtained that acorns could be traced back by smell. Many indications were obtained that visual information from near and far beacons, memorized during hiding, was used in finding acorns. The use of beacons by captive jays was also studied. Experiments led to the conclusion that vertical beacons are more important to retrieving birds than identical horizontal ones. The discrepancy with the jay's preference for horizontal structures during hiding is discussed. Most seedlings emerge in May and June. The distribution pattern of seedlings and bill prints on the shells of their acorns indicated that many seedlings emerged from acorns hidden by jays in the previous autumn.
The cotyledons of these plants remain underground and are in excellent condition in spring and early summer. Jays exploited acorns by pulling at the stem of seedlings and then removing the cotyledons. This did not usually damage the plants severely. Jays can find acorns in this situation partly because they remember where they buried them. In addition, it was shown that jays select seedlings of oak rather than those of other species, and that they preferentially inspected those seedlings that were most profitable in terms of cotyledon yield and quality. Experiments uncovered some of the visual cues used in this discrimination. The effects of hoarding on the preservation of acorns were examined in the field and the laboratory. Being buried reduced the chance that acorns were robbed by conspecifics and other acorn feeders. Scatter hoarding did not lead to better protection of buried acorns than larder hoarding, but the spread of risk was better in the former than in the latter. It was concluded that the way in which jays hoard acorns increases the chance that they can exploit them later. In addition, the condition of acorns is better preserved by being buried. An analysis was made of the consequences of the jay's behaviour for oaks. The oak does incur certain costs: some of its acorns are eaten by jays during the dispersal and storage phase, and some seedlings are damaged as a consequence of cotyledon removal. However, these costs are outweighed by the benefits the oak receives. Many of its most viable acorns are widely dispersed and buried at sites where the prospects for further development into a mature oak are highly favourable. The adaptiveness of the characters involved in preferential feeding on and hoarding of acorns by jays is discussed in relation to several environmental pressures: competition with allied species; food fluctuations in the jay's niche; and food competitors better equipped to break up hard "dry" fruits. Conversely, jays exert several selective pressures which are likely to have evolutionary consequences for oaks, such as the selection of long-slim and large acorns with tight shells. In addition, oak seedlings with a long tap root and tough stem are selected for. Although factors other than mutual selective pressures between the two may have affected the present-day fit between jays and oaks, it is concluded that several characters of jays and oaks can be considered co-adapted features of a symbiotic relationship.


1999, Vol 9 (6), pp. 445-451
Author(s): S. Di Girolamo, W. Di Nardo, A. Cosenza, F. Ottaviani, A. Dickmann, ...

The role of vision in postural control is crucial and is closely related to the characteristics of the visual stimulus and to the performance of the visual system. The purpose of this investigation was to evaluate the effects of chronically reduced visual cues on postural control in patients affected by congenital nystagmus (CN). Since birth, these patients have developed a postural strategy based mainly on vestibular and somatosensory cues. Fifteen patients affected by CN and 15 normal controls (NC) were enrolled in the study and evaluated by means of dynamic posturography. Overall postural control in CN patients was impaired, as demonstrated by the equilibrium score and by changes in postural strategy. This impairment was even more pronounced in the CN than in the NC group when somatosensory cues were experimentally reduced. A nonspecific pattern of visual impairment and a pathological composite score were also present. Our data indicate that in patients affected by CN, postural balance is impaired especially when postural control relies mainly on visual cues. Moreover, a decrease in the accuracy of somatosensory cues has a proportionally greater effect on their balance than it does in normal subjects.
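
Equilibrium scores in dynamic posturography are conventionally derived from peak-to-peak anterior-posterior sway relative to a theoretical stability limit of roughly 12.5 degrees; the sketch below applies that conventional formula to made-up sway traces and is not based on the patient data reported here.

```python
import numpy as np

def equilibrium_score(sway_deg, limit_deg=12.5):
    """Conventional posturography equilibrium score: 100 for no sway,
    0 when peak-to-peak anterior-posterior sway reaches the stability limit."""
    peak_to_peak = np.ptp(sway_deg)
    return max(0.0, (1.0 - peak_to_peak / limit_deg) * 100.0)

# Made-up anterior-posterior sway traces (degrees) for illustration only.
rng = np.random.default_rng(4)
control_sway = rng.normal(0, 0.6, 2000).cumsum() * 0.01   # small, slow sway
patient_sway = rng.normal(0, 1.4, 2000).cumsum() * 0.01   # larger sway
print(round(equilibrium_score(control_sway), 1), round(equilibrium_score(patient_sway), 1))
```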


2018, Vol 40 (1), pp. 93-109
Author(s): Yi Zheng, Arthur G. Samuel

It has been documented that lipreading facilitates the understanding of difficult speech, such as noisy speech and time-compressed speech. However, relatively little work has addressed the role of visual information in perceiving accented speech, another type of difficult speech. In this study, we specifically focus on accented word recognition. One hundred forty-two native English speakers made lexical decision judgments on English words or nonwords produced by speakers with Mandarin Chinese accents. The stimuli were presented either as videos of a relatively distant speaker or as videos in which we zoomed in on the speaker’s head. Consistent with studies of degraded speech, listeners were more accurate at recognizing accented words when they saw lip movements from the closer apparent distance. The effect of apparent distance tended to be larger under nonoptimal conditions: when stimuli were nonwords rather than words, and when they were produced by a speaker with a relatively strong accent. However, we did not find any influence of listeners’ prior experience with Chinese-accented speech, suggesting that cross-talker generalization is limited. The current study provides practical suggestions for effective communication between native and nonnative speakers: visual information is useful, and it is more useful in some circumstances than others.


2012, Vol 2012, pp. 1-17
Author(s): Aurel Vasile Martiniuc, Alois Knoll

Information about the visual stimulus is encoded in spike trains at the output of the retina by retinal ganglion cells (RGCs). Among these, directionally selective cells (DSRGCs) signal the direction of stimulus motion. DSRGC spike trains show accentuated periods of short interspike intervals (ISIs) framed by periods of isolated spikes. Here we use two types of visual stimulus, white noise and drifting bars, and show that short-ISI spikes in DSRGC spike trains are more often correlated with the cells’ preferred stimulus feature (that is, the direction of stimulus motion) and carry more information than longer-ISI spikes. First, our results show that the correlation between stimulus and recorded neuronal response is strongest for short-ISI spiking activity and decreases as the ISI becomes larger. We then used the drifting bar (grating) stimulus and found that, as the ISI becomes shorter, directional selectivity improves and information rates increase. Interestingly, for the less common type of DSRGC, known as the ON-DSRGC, the short-ISI distribution and information rates revealed consistent differences compared with the other directionally selective cell type, the ON-OFF DSRGC. Taken together, these findings suggest that ISI-based temporal filtering constitutes a mechanism for visual information processing at the output of the retina toward higher stages within the early visual system.
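
A minimal sketch of the ISI-based split: label each spike by the interval to its nearest neighbouring spike, then compute a direction selectivity index separately for short-ISI and isolated spikes. The 30-ms threshold, synthetic spike times, and DSI definition are illustrative assumptions, not the analysis used in the paper.

```python
import numpy as np

def split_by_isi(spike_times, threshold_s=0.03):
    """Split a spike train into spikes whose interval to either neighbouring
    spike is below the threshold ('short-ISI') and the remaining isolated spikes."""
    t = np.sort(np.asarray(spike_times))
    isi_prev = np.r_[np.inf, np.diff(t)]
    isi_next = np.r_[np.diff(t), np.inf]
    short = np.minimum(isi_prev, isi_next) < threshold_s
    return t[short], t[~short]

def dsi(n_preferred, n_null):
    """Direction selectivity index: (P - N) / (P + N)."""
    return (n_preferred - n_null) / max(n_preferred + n_null, 1)

# Illustrative spike times during preferred- vs. null-direction bar sweeps.
rng = np.random.default_rng(5)
preferred = np.sort(rng.uniform(0.0, 2.0, 120))   # dense response -> many short ISIs
null = np.sort(rng.uniform(0.0, 2.0, 40))         # sparse response -> mostly isolated spikes
for label, idx in (("short-ISI", 0), ("isolated", 1)):
    p = split_by_isi(preferred)[idx].size
    n = split_by_isi(null)[idx].size
    print(label, round(dsi(p, n), 2))
```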

