Tactile Capture of Auditory Localization Is Modulated by Hand Posture

Author(s):  
Patrick Bruns ◽  
Brigitte Röder

It is well known that spatial discrepancies between synchronized auditory and visual events can lead to mislocalizations of the auditory stimulus toward the visual stimulus, the so-called ventriloquism effect. Recently, a similar effect of touch on audition has been reported. This study investigated whether this audio-tactile ventriloquism effect depends on hand posture. Participants reported the perceived location of brief auditory stimuli that were presented from left, right, and center locations, either alone or with concurrent tactile stimuli to the fingertips situated at the left and right sides of the speaker array. Compared to unimodal presentations, auditory localization was biased toward the side of the concurrent tactile stimulus in the bimodal trials. This effect was reduced but still significant when participants adopted a crossed-hands posture. In this condition, a partial localization bias was observed only for large audio-tactile spatial discrepancies. However, localization was still shifted toward the external location of the tactile stimulus, and not toward the side of the anatomical hand that was stimulated. These results substantiate recent evidence for the existence of an audio-tactile ventriloquism effect and extend these findings by demonstrating that this illusion operates predominantly in an external coordinate system.
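As an illustration only (not the authors' analysis code), the localization bias in paradigms like this is commonly quantified as the shift of mean reported location in bimodal trials relative to unimodal trials. A minimal Python sketch with hypothetical response data:

```python
import numpy as np

# Hypothetical reported azimuths (degrees) for one speaker position;
# positive values indicate responses shifted toward the right.
unimodal = np.array([-2.0, 1.5, 0.5, -1.0, 0.0])       # sound presented alone
bimodal_right = np.array([3.0, 4.5, 2.0, 5.0, 3.5])    # sound + tactile stimulus on the right

# Ventriloquism-style bias: shift of bimodal responses relative to unimodal ones
bias = bimodal_right.mean() - unimodal.mean()
print(f"Localization bias toward the tactile side: {bias:.1f} deg")
```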

1984 ◽  
Vol 59 (1) ◽  
pp. 212-214
Author(s):  
H. W. Craver

The reliability of an attention-focusing technique was assessed for 12 subjects over 4 sessions. Subjects' thought intrusions were counted while they were focusing on either visual or auditory stimuli. Digital temperatures were recorded and an experimental-situation questionnaire was administered. This technique provides extremely reliable self-reports across the sessions. The total number of intrusions was higher for the auditory stimulus than for the visual stimulus. The study's relevance to assessing self-monitoring techniques such as meditation is discussed.


2018 ◽  
Vol 7 ◽  
pp. 172-177
Author(s):  
Łukasz Tyburcy ◽  
Małgorzata Plechawska-Wójcik

The paper describes the results of a comparison of reaction times to visual and auditory stimuli using EEG evoked potentials. Two experiments were conducted: the first explored reaction times to a visual stimulus and the second to an auditory stimulus. Analysis of the data showed that visual stimuli evoke faster reactions than auditory stimuli.


1976 ◽  
Vol 43 (2) ◽  
pp. 487-493 ◽  
Author(s):  
Robert I. Bermant ◽  
Robert B. Welch

Subjects were exposed to a visual and to an auditory stimulus that differed spatially in laterality of origin. The subjects were observed for visual biasing of auditory localization (the momentary influence of a light on the spatially perceived location of a simultaneously presented sound) and for auditory aftereffect (a change in perceived location of a sound that persists over time and is measured after termination of the visual stimulus). A significant effect of visual stimulation on auditory localization was found only with the measure of bias. Bias was tested as a function of degree of visual-auditory separation (10/20/30°), eye position (straight-ahead/visual stimulus fixation), and position of visual stimulus relative to auditory stimulus (left/right). Only eye position proved statistically significant; straight-ahead eye position induced more bias than did fixation of the visual stimulus.


2002 ◽  
Vol 55 (1b) ◽  
pp. 61-73 ◽  
Author(s):  
John M. Pearce ◽  
David N. George ◽  
Aydan Aydin

Rats received Pavlovian conditioning in which food was signalled by a visual stimulus, A+, an auditory stimulus, B+, and a compound composed of different visual and auditory stimuli, CD+. Test trials were then given with the compound AB. Experiments 1 and 2A revealed stronger responding during AB than during CD. In Experiment 2B, there was no evidence of a summation of responding during AB when A+ B+ training was conducted in the absence of CD+ trials. A further failure to observe abnormally strong responding during AB was found in Experiment 3, in which the training trials with A+ B+ CD+ were accompanied by trials in which C and D were separately paired with food. The results are explained in terms of a configural theory of conditioning, which assumes that responding during a compound is determined by generalization from its components, as well as from other compounds to which it is similar.


2007 ◽  
Vol 98 (4) ◽  
pp. 2399-2413 ◽  
Author(s):  
Vivian M. Ciaramitaro ◽  
Giedrius T. Buračas ◽  
Geoffrey M. Boynton

Attending to a visual or auditory stimulus often requires irrelevant information to be filtered out, both within the modality attended and in other modalities. For example, attentively listening to a phone conversation can diminish our ability to detect visual events. We used functional magnetic resonance imaging (fMRI) to examine brain responses to visual and auditory stimuli while subjects attended visual or auditory information. Although early cortical areas are traditionally considered unimodal, we found that brain responses to the same ignored information depended on the modality attended. In early visual area V1, responses to ignored visual stimuli were weaker when attending to another visual stimulus, compared with attending to an auditory stimulus. The opposite was true in more central visual area MT+, where responses to ignored visual stimuli were weaker when attending to an auditory stimulus. Furthermore, fMRI responses to the same ignored visual information depended on the location of the auditory stimulus, with stronger responses when the attended auditory stimulus shared the same side of space as the ignored visual stimulus. In early auditory cortex, responses to ignored auditory stimuli were weaker when attending a visual stimulus. A simple parameterization of our data can describe the effects of redirecting attention across space within the same modality (spatial attention) or across modalities (cross-modal attention), and the influence of spatial attention across modalities (cross-modal spatial attention). Our results suggest that the representation of unattended information depends on whether attention is directed to another stimulus in the same modality or the same region of space.


2012 ◽  
Vol 25 (0) ◽  
pp. 24
Author(s):  
Roberto Cecere ◽  
Benjamin De Haas ◽  
Harriett Cullen ◽  
Jon Driver ◽  
Vincenzo Romei

There is converging evidence that the duration of an auditory event can affect the perceived duration of a co-occurring visual event. When a brief visual stimulus is accompanied by a longer auditory stimulus, the perceived visual duration stretches. If this reflects a genuine prolongation of visual stimulus perception, it should result in enhanced perception of non-temporal visual stimulus qualities. To test this hypothesis, in a temporal two-alternative forced choice task, 28 participants were asked to indicate whether a short (∼24 ms), peri-threshold visual stimulus was presented in the first or in the second of two consecutive displays. Each display was accompanied by a sound of equal or longer duration (36, 48, 60, 72, 84, 96, 190 ms) than the visual stimulus. As a control condition, visual stimuli of different durations (matching the auditory stimulus durations) were presented alone. We predicted that visual detection would improve as a function of sound duration. Moreover, if the expected cross-modal effect reflects sustained visual perception, it should positively correlate with the improvement observed for genuinely longer visual stimuli. Results showed that detection sensitivity (d′) for the 24 ms visual stimulus was significantly enhanced when paired with longer auditory stimuli ranging from 60 to 96 ms in duration. Visual detection performance dropped to baseline levels with 190 ms sounds. Crucially, the enhancement for auditory durations of 60–96 ms correlated significantly with the d′ enhancement for visual stimuli lasting 60–96 ms in the control condition. We conclude that the duration of co-occurring auditory stimuli not only influences the perceived duration of visual stimuli but also reflects a genuine prolongation of visual perception.
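For readers unfamiliar with the measure, detection sensitivity d′ is conventionally computed as z(hit rate) − z(false-alarm rate). The sketch below is a generic illustration of that formula, not the authors' analysis pipeline, and the trial counts are hypothetical:

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Detection sensitivity d' = z(hit rate) - z(false-alarm rate),
    with a log-linear style correction to avoid rates of exactly 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical counts for one sound-duration condition
print(d_prime(hits=38, misses=12, false_alarms=10, correct_rejections=40))
```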


2012 ◽  
Vol 25 (0) ◽  
pp. 166
Author(s):  
Mario Maiworm ◽  
Marina Bellantoni ◽  
Charles Spence ◽  
Brigitte Roeder

It is currently unknown to what extent the integration of inputs from different modalities is subject to the influence of attention, emotion, and/or motivation. The ventriloquist effect is widely assumed to be an automatic, crossmodal phenomenon, normally shifting the perceived location of an auditory stimulus toward a concurrently presented visual stimulus. The present study examined whether audiovisual binding, as indicated by the magnitude of the ventriloquist effect, is influenced by threatening auditory stimuli presented prior to the ventriloquist experiment. Syllables spoken in a fearful voice were presented from one of eight loudspeakers while syllables spoken in a neutral voice were presented from the other seven locations. Subsequently, participants had to localize pure tones while trying to ignore concurrent light flashes (both of which were emotionally neutral). A reliable ventriloquist effect was observed. The emotional stimulus manipulation resulted in a reduced ventriloquist effect in both hemifields, as compared to a control group exposed to a similar attention-capturing but non-emotional manipulation. These results suggest that the emotional system is capable of influencing crossmodal binding processes which have heretofore been considered automatic.


2021 ◽  
pp. 1-12
Author(s):  
Anna Borgolte ◽  
Ahmad Bransi ◽  
Johanna Seifert ◽  
Sermin Toto ◽  
Gregor R. Szycik ◽  
...  

Synaesthesia is a multimodal phenomenon in which the activation of one sensory modality leads to an involuntary additional experience in another sensory modality. To date, normal multisensory processing has hardly been investigated in synaesthetes. In the present study we examine processes of audiovisual separation in synaesthesia by using a simultaneity judgement task. Subjects were asked to indicate whether an acoustic and a visual stimulus occurred simultaneously or not. Stimulus onset asynchronies (SOAs) as well as the temporal order of the stimuli were systematically varied. Our results demonstrate that synaesthetes are better at separating auditory and visual events than control subjects, but only when vision leads.
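Simultaneity-judgement data of this kind are often summarized by fitting the proportion of "simultaneous" responses across SOAs with a Gaussian-shaped psychometric curve, whose width estimates the temporal binding window. The sketch below illustrates that common approach with hypothetical data; it is not the authors' analysis:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa, amp, mu, sigma):
    """Proportion of 'simultaneous' responses as a function of SOA (ms)."""
    return amp * np.exp(-(soa - mu) ** 2 / (2 * sigma ** 2))

# Hypothetical SOAs (negative = auditory leads) and response proportions
soa = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300])
p_simultaneous = np.array([0.10, 0.30, 0.70, 0.90, 0.95, 0.85, 0.60, 0.25, 0.08])

(amp, mu, sigma), _ = curve_fit(gaussian, soa, p_simultaneous, p0=[1.0, 0.0, 100.0])
print(f"Point of subjective simultaneity: {mu:.0f} ms; window width (SD): {sigma:.0f} ms")
```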


1979 ◽  
Vol 49 (3) ◽  
pp. 867-870
Author(s):  
Allan L. Combs ◽  
Dana A. Beezley ◽  
Gary M. Prater ◽  
Gerald F. Henning ◽  
Rhonda F. Cottrell

Among a group of 12 persons selected for the ability to write with ease with either hand, none were found to write using a hooked hand posture with either the right or left hand. Tests of verbal and manipulospatial ability indicated a normal balance of these two types of abilities, usually associated with the left and right hemispheres. Findings are discussed in terms of implications for cerebral organization and hand position in writing.


1969 ◽  
Vol 12 (4) ◽  
pp. 833-839 ◽  
Author(s):  
Kenneth C. Gray ◽  
Dean E. Williams

Changes in pupil size were studied in 24 stuttering and 30 nonstuttering adults during a 4-sec period following the presentation of single-word auditory stimuli and before a signal to respond. Subjects were required first to respond with a single word which was the opposite of the word presented and later to give a one-word free-association response to words of both emotional and neutral connotations. Pupil size was measured also while subjects merely listened to the word stimuli. The process of attending to an auditory stimulus was associated with pupil dilation. Pupil response was significantly greater (in absolute diameter and in dilation) when subjects were required to give an oral response to the stimulus than when they simply listened to the stimulus. Furthermore, the extent of the pupil reaction was related to the nature of the stimulus presented. Such differences in arousal did not occur to any greater degree in stutterers than in nonstutterers. Moreover, among stutterers, measures of pupil size were not predictive of stuttering. Thus, the cues which the stutterer associates with the anticipation of stuttering do not appear to be reflected in the physiological changes associated with pupillary movement.

