A Psychophysiological Study of Auditory Illusions of Approach and Withdrawal in the Context of the Perceptual Environment

2007, Vol 10 (2), pp. 266-276
Author(s): Inna A. Vartanyan, Irina G. Andreeva

Auditory perception of the depth of space is based mainly on spectral and amplitude changes of sound waves originating from the sound source and reaching the listener. The perceptual illusion of movement of an auditory image, caused by changes in the amplitude and/or frequency of a signal tone emanating from an immobile loudspeaker, was studied. Analysis of the participants' data revealed the range of combinations of amplitude and frequency changes for which the direction of movement was perceived identically by all participants, despite their significantly different criteria for assessing movement. Additional auditory and visual information about the conditions of radial movement (near or far field) determined the listeners' interpretation of changes in the signal parameters. The data obtained on the perception of the approach and withdrawal models indicate that the principal cues for perceiving the distance of immobile sound sources operate similarly for an auditory image moving along a radial axis.
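
The stimulus manipulation described above, amplitude and/or frequency ramps applied to a tone played from a fixed loudspeaker, is straightforward to synthesize. A minimal sketch (not the authors' code; all parameter values are illustrative):

```python
import numpy as np

def motion_illusion_tone(f0=1000.0, duration=1.0, fs=44100,
                         amp_db_change=+6.0, freq_change=+20.0):
    """Synthesize a tone whose level and frequency ramp linearly,
    the kind of stimulus used to evoke approach/withdrawal illusions.
    A rising level (positive amp_db_change) tends to be heard as approach."""
    t = np.arange(int(duration * fs)) / fs
    # Linear frequency ramp: instantaneous phase is the integral of frequency.
    f_inst = f0 + freq_change * t / duration
    phase = 2 * np.pi * np.cumsum(f_inst) / fs
    # Linear level ramp in dB, converted to linear amplitude.
    level_db = amp_db_change * t / duration
    amp = 10 ** (level_db / 20)
    return amp * np.sin(phase)
```

Rising intensity is the dominant approach cue; pairing it with a small upward frequency glide mimics the Doppler component of a real approaching source.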

2021, Vol 13 (1), pp. 339
Author(s): Yoshimi Hasegawa, Siu-Kit Lau

A growing number of soundscape studies involving audiovisual factors have been conducted; however, their bimodal and interactive effects on indoor soundscape evaluations have not yet been thoroughly reviewed. The overarching goal of this systematic review was to develop a framework for designing sustainable indoor soundscapes by focusing on audiovisual factors and their relations. A search for individual studies was conducted through three databases and search engines: Scopus, Web of Science, and PubMed. Based on qualitative review of the thirty selected papers, a framework of indoor soundscape evaluation concerning visual and audiovisual indicators was proposed. Overall, greenery was the most important visual variable, followed by water features, in moderating the noise annoyance perceived by occupants in a given indoor environment. The presence of visual information and the visibility of sound sources would moderate perceived noise annoyance and influence other audio-related perceptions. Furthermore, sound sources would impact multiple perceptual responses (auditory, visual, cognitive, and emotional) related to the overall soundscape experience when certain visual factors are interactively involved. The proposed framework highlights the potential use of the bimodality and interactivity of audiovisual factors for designing indoor sound environments more effectively.


2021, Vol 2
Author(s): Thirsa Huisman, Axel Ahrens, Ewen MacDonald

To reproduce realistic audio-visual scenarios in the laboratory, Ambisonics is often used to reproduce a sound field over loudspeakers while virtual reality (VR) glasses present the visual information. Both technologies have been shown to be suitable for research. However, the combination of the two, Ambisonics and VR glasses, might affect the spatial cues for auditory localization and thus the localization percept. Here, we investigated how VR glasses affect the localization of virtual sound sources on the horizontal plane produced using 1st-, 3rd-, 5th-, or 11th-order Ambisonics, with and without visual information. Results showed that the localization error was larger with 1st-order Ambisonics than with the higher orders, while the differences across the higher orders were small. The physical presence of the VR glasses (a head-mounted display, HMD) without visual information increased the perceived lateralization of the auditory stimuli by about 2° on average, especially in the right hemisphere. Presenting visual information about the environment and potential sound sources reduced this HMD-induced shift but could not fully compensate for it. While localization performance itself was affected by the Ambisonics order, there was no interaction between the Ambisonics order and the effect of the HMD. Thus, the presence of VR glasses can alter acoustic localization when using Ambisonics sound reproduction, but visual information can compensate for most of the effect. As such, most use cases for VR will be unaffected by these shifts in the perceived location of auditory stimuli.
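
For readers unfamiliar with the reproduction technique: a horizontal (2D) Ambisonics representation of order N carries 2N+1 circular-harmonic channels, which is why spatial resolution sharpens as the order rises. A minimal sketch of encoding a virtual source and decoding it with a basic sampling decoder (illustrative only; the study's actual decoder and loudspeaker layout are not specified here):

```python
import numpy as np

def encode_horizontal(azimuth, order):
    """Encode a plane wave at `azimuth` (radians) into 2*order+1
    horizontal (circular-harmonic) Ambisonics channels."""
    comps = [1.0]
    for m in range(1, order + 1):
        comps += [np.cos(m * azimuth), np.sin(m * azimuth)]
    return np.array(comps)

def sampling_decoder(speaker_azimuths, order):
    """Basic (sampling) decoder: evaluate the encoding functions at each
    loudspeaker direction; speaker gains are then D @ b for encoded b.
    The 1/M normalization is a simplification."""
    D = np.array([encode_horizontal(a, order) for a in speaker_azimuths])
    return D / len(speaker_azimuths)

# Example: a source at 30 deg rendered 3rd order over 16 equally spaced speakers.
spk = np.linspace(0, 2 * np.pi, 16, endpoint=False)
gains = sampling_decoder(spk, 3) @ encode_horizontal(np.deg2rad(30), 3)
```

Under this encoding, 1st order carries only 3 channels while 11th order carries 23, consistent with the reported drop in localization error at higher orders.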


2015, Vol 114 (4), pp. 2187-2193
Author(s): Shoko Kasuga, Sebastian Telgen, Junichi Ushiba, Daichi Nozaki, Jörn Diedrichsen

When we learn a novel task, the motor system needs to acquire both feedforward and feedback control. Currently, little is known about how the learning of these two mechanisms relates to each other. In the present study, we tested whether feedforward and feedback control need to be learned separately or whether they are learned as a common mechanism when a new control policy is acquired. Participants were trained to reach to two lateral targets and one central target in an environment with mirror (left-right)-reversed visual feedback. One group was allowed to make online movement corrections, whereas the other group received visual information only after the end of the movement. Learning of feedforward control was assessed by measuring the accuracy of the initial movement direction to the lateral targets. Feedback control was measured in the responses to sudden visual perturbations of the cursor when reaching to the central target. Although feedforward control improved in both groups, it was significantly better when online corrections were not allowed. In contrast, feedback control changed adaptively only in participants who received online feedback and remained unchanged in the group without online corrections. Our findings suggest that when a new control policy is acquired, feedforward and feedback control are learned separately, and that there may be a trade-off in learning between feedback and feedforward controllers.
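
As an illustration of the experimental manipulation and the feedforward measure, a minimal sketch (hypothetical helper names, not the authors' analysis code):

```python
import numpy as np

def mirror_feedback(hand_xy, mirror_axis_x=0.0):
    """Left-right mirror reversal of cursor feedback about a vertical
    axis, as in the training environment described above."""
    x, y = hand_xy
    return (2 * mirror_axis_x - x, y)

def initial_direction_error(start, target, early_pos):
    """Feedforward measure: signed angle between the initial movement
    direction (start -> position early in the reach, before feedback
    can act) and the straight-line target direction, in degrees."""
    v_move = np.subtract(early_pos, start)
    v_target = np.subtract(target, start)
    ang = lambda v: np.arctan2(v[1], v[0])
    err = np.degrees(ang(v_move) - ang(v_target))
    return (err + 180) % 360 - 180  # wrap to [-180, 180)
```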


Author(s): Bryan Dickens, Steven Sellers, Gabe Harms, Owen Shartle, Conrad S. Tucker

The authors of this work propose a virtual reality approach that overcomes two fundamental challenges of physical learning environments: i) variations in audial quality and ii) variations in visual quality, in an effort to achieve individual customization of information content. In physical brick-and-mortar environments, the dissemination of information is influenced by the medium the information travels through, which is typically degraded by line-of-sight constraints and obstructions that distort sound waves. The fundamental research question is how to achieve consistent quality of disseminated information as the number of audience members increases. There exists a knowledge gap relating to the creation of a scalable, networked system for enabling real-time information exchange. The authors propose a virtual reality approach that addresses these limitations of physical learning spaces by minimizing the variability in audial and visual information dissemination. A real-time, networked architecture is proposed that enables multiple individuals to simultaneously experience the same quality of audial and visual information, based on a determined optimal geospatial position for audial and visual exposure. A case study is introduced that first quantifies, in simulation, the audial and visual information loss experienced by audience members receiving information at different geospatial locations in a brick-and-mortar environment. This information loss is compared against the proposed virtual reality architecture, which minimizes the variation in information dissemination. The authors demonstrate that the proposed solution is an improved, scalable multi-user system, unlike brick-and-mortar environments that are constrained by size and geospatial positioning.
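
The audial side of the position-dependent quality problem can be illustrated with a free-field estimate: sound pressure level falls by 20·log10(d) dB with distance d from the source, so listeners seated farther away necessarily receive a degraded signal. A minimal sketch (illustrative numbers, not the authors' simulation):

```python
import numpy as np

def spl_at_seat(spl_at_1m, seat_xy, source_xy=(0.0, 0.0)):
    """Free-field estimate of sound pressure level at a seat:
    level falls by 20*log10(d) dB relative to the 1 m reference."""
    d = np.hypot(seat_xy[0] - source_xy[0], seat_xy[1] - source_xy[1])
    return spl_at_1m - 20 * np.log10(max(d, 1.0))

# Example: level variation across rows 2 m to 20 m from the lecturer,
# assuming an (illustrative) 70 dB SPL speech level at 1 m.
rows = np.arange(2, 21, 2)
levels = [spl_at_seat(70.0, (r, 0.0)) for r in rows]
```

In a virtual environment, every listener can instead be rendered at the same effective distance, which is the equalization the proposed architecture exploits.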


1999, Vol 81 (2), pp. 875-894
Author(s): M.T.V. Johnson, J. D. Coltz, M. C. Hagen, T. J. Ebner

Johnson, M.T.V., J. D. Coltz, M. C. Hagen, and T. J. Ebner. Visuomotor processing as reflected in the directional discharge of premotor and primary motor cortex neurons. J. Neurophysiol. 81: 875–894, 1999. Premotor and primary motor cortical neuronal firing was studied in two monkeys during an instructed delay, pursuit tracking task. The task included a premovement “cue period,” during which the target was presented at the periphery of the workspace and moved to the center of the workspace along one of eight directions at one of four constant speeds. The “track period” consisted of a visually guided, error-constrained arm movement during which the animal tracked the target as it moved from the central start box along a line to the opposite periphery of the workspace. Behaviorally, the animals tracked the required directions and speeds with highly constrained trajectories. The eye movements consisted of saccades to the target at the onset of the cue period, followed by smooth pursuit intermingled with saccades throughout the cue and track periods. Initially, an analysis of variance (ANOVA) was used to test for direction and period effects in the firing. Subsequently, a linear regression analysis was used to fit the average firing from the cue and track periods to a cosine model. Directional tuning as determined by a significant fit to the cosine model was a prominent feature of the discharge during both the cue and track periods. However, the directional tuning of the firing of a single cell was not always constant across the cue and track periods. Approximately one-half of the neurons had differences in their preferred directions (PDs) of >45° between cue and track periods. The PD in the cue or track period was not dependent on the target speed. A second linear regression analysis based on calculation of the preferred direction in 20-ms bins (i.e., the PD trajectory) was used to examine on a finer time scale the temporal evolution of this change in directional tuning. The PD trajectories in the cue period were not straight but instead rotated over the workspace to align with the track period PD. Both clockwise and counterclockwise rotations occurred. The PD trajectories were relatively straight during most of the track period. The rotation and eventual convergence of the PD trajectories in the cue period to the preferred direction of the track period may reflect the transformation of visual information into motor commands. The widely dispersed PD trajectories in the cue period would allow targets to be detected over a wide spatial aperture. The convergence of the PD trajectories occurring at the cue-track transition may serve as a “Go” signal to move that was not explicitly supplied by the paradigm. Furthermore, the rotation and convergence of the PD trajectories may provide a mechanism for nonstandard mapping. Standard mapping refers to a sensorimotor transformation in which the stimulus is the object of the reach. Nonstandard mapping is the mapping of an arbitrary stimulus into an arbitrary movement. The shifts in the PD may allow relevant visual information from any direction to be transformed into an appropriate movement direction, providing a neural substrate for nonstandard stimulus-response mappings.
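
The cosine model used in the regression analysis, f(θ) = b0 + c·cos(θ − PD), is linear in cos θ and sin θ, so the preferred direction can be recovered with ordinary least squares. A minimal sketch (not the authors' code):

```python
import numpy as np

def fit_cosine_tuning(directions_deg, firing_rates):
    """Fit f(theta) = b0 + b1*cos(theta) + b2*sin(theta), algebraically
    equivalent to the cosine model b0 + c*cos(theta - PD).
    Returns (baseline, modulation depth, preferred direction in deg)."""
    th = np.deg2rad(np.asarray(directions_deg))
    X = np.column_stack([np.ones_like(th), np.cos(th), np.sin(th)])
    b0, b1, b2 = np.linalg.lstsq(X, np.asarray(firing_rates), rcond=None)[0]
    depth = np.hypot(b1, b2)                      # c = sqrt(b1^2 + b2^2)
    pd = np.degrees(np.arctan2(b2, b1)) % 360     # PD = atan2(b2, b1)
    return b0, depth, pd
```

Applying the same fit in successive 20-ms bins yields the PD trajectories whose rotation and convergence the abstract describes.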


2008, Vol 107 (2), pp. 457-472
Author(s): Chia-Liang Tsai, Sheng-Kuang Wu

The study explored the relations between visual perceptual deficits and motor impairments in 60 children with Developmental Coordination Disorder (120.8 ± 4.0 mo.) and 60 controls (121.0 ± 5.3 mo.), matched by sex (29 boys and 31 girls) and age. They were separately assessed on fine and gross motor-dexterity tasks of the Movement Assessment Battery for Children, on static balance and reaction time of the lower extremities with eyes open or closed, and on the Test of Visual-Perceptual Skills–Revised. Analysis showed that the children with Developmental Coordination Disorder performed significantly worse than the control group, but only visual perception and motor skills with timed responses were significantly correlated. When visual information was controlled, no significant correlation remained, so motor-free visual perception appears to be significantly related to motor performance with a speed component in these children.
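
Statistically controlling for a third variable as described amounts to a partial correlation: correlate the residuals of the two measures after regressing out the control variable. A minimal sketch (variable names are illustrative):

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation between x and y after regressing out z, as when
    the visual-information component is controlled statistically."""
    Z = np.column_stack([np.ones(len(z)), z])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]
```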


1992, Vol 74 (2), pp. 443-448
Author(s): Hitoshi Okada, Kazuo Matsuoka

The purpose of this study was to examine whether the auditory image of a pure tone facilitates or interferes with auditory perception of the pure tone. The masked threshold of a pure tone in white noise was compared with and without the image of a pure tone. In contrast to Farah and Smith's (1983) finding of facilitation, imagery interfered with detection of the pure tone only when the frequencies of the imagined and detected tones were the same. This interference was interpreted as showing that assimilation of the signal tone into imagery, i.e., the effect described by Perky in 1910, occurred in the auditory modality. An explanation of the differences between findings of interference and facilitation is offered.
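
Masked thresholds of the kind compared here are typically tracked adaptively. A toy sketch of a 1-up/1-down staircase run against a simulated listener (the paper's actual psychophysical procedure is not reproduced here):

```python
import numpy as np

def simulate_staircase(true_threshold_db, n_trials=60, start_db=60.0,
                       step_db=2.0, rng=np.random.default_rng(0)):
    """1-up/1-down track converging on the 50%-detection point of a
    masked pure tone; detections are simulated from a logistic
    psychometric function centered on the true threshold."""
    level, levels, responses = start_db, [], []
    for _ in range(n_trials):
        p_detect = 1 / (1 + np.exp(-(level - true_threshold_db)))
        detected = rng.random() < p_detect
        levels.append(level)
        responses.append(detected)
        level += -step_db if detected else step_db  # down on hit, up on miss
    # Threshold estimate: mean level at the last few response reversals.
    rev = [levels[i] for i in range(1, n_trials) if responses[i] != responses[i - 1]]
    return np.mean(rev[-8:])
```

Running such a track with and without the imagery instruction, and comparing the two threshold estimates, is the logic of the comparison reported above.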


1996, Vol 76 (3), pp. 2071-2076
Author(s): B. Stricanne, R. A. Andersen, P. Mazzoni

1. The lateral intraparietal area (LIP) of the posterior parietal cortex lies within the dorsal cortical stream for spatial vision and processes visual information to plan saccadic eye movements. We investigated how LIP neurons respond when a monkey makes saccades to the remembered location of sound sources in the absence of visual stimulation. 2. Forty-three (36%) of the 118 neurons sampled showed significant auditory triggered activity during the memory period. This figure is similar to the proportion of cells showing visually triggered memory activity. 3. Of the cells showing auditory memory activity, 44% discharged in an eye-centered manner, similar to the way in which LIP cells discharge for visually initiated saccades. Another 33% responded in head-centered coordinates, and the remaining 23% had responses intermediate between the two reference frames. 4. For a substantial number of cells in all three categories, the magnitude of the response was modulated by eye position. Similar orbital "gain fields" had been shown previously for visual saccades. 5. We propose that area LIP is either at the origin of, or participates in, the transformation of auditory signals for oculomotor purposes, and that orbital gains on the discharge are part of this process. 6. Finally, we suggest that, by the level of area LIP, cells are concerned with the abstract quality of where a stimulus is in space, independent of the exact nature of the stimulus.
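
A toy illustration of the eye-centered versus head-centered distinction in point 3: if a cell codes target azimuth in eye coordinates, tuning curves measured at different fixations should align after shifting each curve by its eye offset; if head-centered, they should align without shifting. A crude sketch on a uniform azimuth grid (not the authors' analysis; the wraparound in np.roll is a simplification):

```python
import numpy as np

def frame_correlation(rates, eye_steps):
    """rates: (2, n_targets) memory-period firing for two fixations,
    measured on a uniform azimuth grid in HEAD coordinates;
    eye_steps: fixation offsets in grid steps. Returns cross-fixation
    tuning-curve correlations under each coordinate-frame hypothesis
    (the higher one better matches the cell)."""
    head = np.corrcoef(rates[0], rates[1])[0, 1]            # align as-is
    shifted = [np.roll(r, -s) for r, s in zip(rates, eye_steps)]
    eye = np.corrcoef(shifted[0], shifted[1])[0, 1]         # align after shift
    return {"head-centered": head, "eye-centered": eye}
```

Cells falling between the two frames, like the 23% reported above, would show comparable correlations under both hypotheses.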


Author(s): Santiago Naranjo-Sierra, Lauren K. Ng Tucker

Ultrasonography is the use of sound waves to create images and is used mainly for diagnostic purposes and for real-time guidance during procedures. Point-of-care ultrasonography is widely used in fields such as anesthesia, critical care, and emergency medicine, in which it is becoming an important part of the current standard of care because of its ability to provide accurate visual information about a patient, either to rapidly evaluate clinical status or to guide procedures, without requiring transfers to other areas. For patients in an intensive care unit, focused ultrasonography has been reported to result in management changes in more than 50% of cases.

