Auditory gating during visually-guided action?

2012 ◽  
Vol 25 (0) ◽  
pp. 106 ◽  
Author(s):  
Luc Tremblay ◽  
Joanne Wong ◽  
Gerome Manson

We recently used an audiovisual illusion (Shams et al., 2000) during fast and accurate reaching movements and showed that susceptibility to the fusion illusion is reduced at high limb velocities (Tremblay and Nguyen, 2010). This study aimed to determine whether auditory information processing is suppressed during voluntary action (Chapman and Beauchamp, 2006), which could explain the reduced fusion during reaching movements. Instead of asking participants to report the number of flashes, we asked them to report the number of beeps (Andersen et al., 2004). Before each trial, participants fixated on a target LED presented on a horizontal reaching surface. The secondary stimuli combined 3 flash levels (0, 1, 2) with 2 beep levels (1, 2). During control tests, the secondary stimuli were presented at rest. In the experimental phase, stimuli were presented at 0, 100, or 200 ms relative to the onset of a fast and accurate movement. Participants reported the number of beeps after each trial. A 3 flash × 2 beep × 4 presentation condition (0, 100, 200 ms + control) ANOVA revealed that participants were less accurate at perceiving the actual number of beeps during the movement than in the control condition. More importantly, the number of flashes influenced the number of perceived beeps during the movement but not in the control condition. Lastly, no relationship was found between limb velocity and the number of perceived beeps. These results indicate that auditory information is significantly suppressed during goal-directed action, but this mechanism alone fails to explain the link between limb velocity and the fusion illusion.
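
For readers who want to see the shape of that analysis, the following is a minimal, illustrative sketch of a 3 (flash) × 2 (beep) × 4 (presentation) repeated-measures ANOVA in Python with statsmodels. The DataFrame, file name, and column names are assumptions made for illustration; this is not the authors' actual data or analysis code.

```python
# Illustrative sketch of a 3 (flash) x 2 (beep) x 4 (presentation)
# repeated-measures ANOVA like the one described in the abstract.
# The file and column names below are hypothetical.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Assumed long format, one aggregated observation per subject per cell:
#   subject | flash (0/1/2) | beep (1/2)
#   | presentation (0/100/200/control) | reported_beeps
df = pd.read_csv("beep_report_data.csv")  # hypothetical file

model = AnovaRM(
    data=df,
    depvar="reported_beeps",
    subject="subject",
    within=["flash", "beep", "presentation"],
)
result = model.fit()
print(result)  # F and p values for main effects and interactions
```

An interaction of flash × presentation in such an analysis would correspond to the abstract's key finding that the number of flashes biased the number of perceived beeps during movement but not at rest.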

2008 ◽  
Vol 31 (2) ◽  
pp. 220-221 ◽  
Author(s):  
David Whitney

Accurate perception of moving objects would be useful; accurate visually guided action is crucial. Visual motion across the scene influences perceived object location and the trajectory of reaching movements to objects. In this commentary, I propose that the visual system assigns the position of any object based on the predominant motion present in the scene, and that this is used to guide reaching movements to compensate for delays in visuomotor processing.


1997 ◽  
Vol 35 (4) ◽  
pp. 191-196 ◽  
Author(s):  
B. Van Sweden ◽  
M.G. Van Erp ◽  
F. Mesotten

Author(s):  
Malte Asendorf ◽  
Moritz Kienzle ◽  
Rachel Ringe ◽  
Fida Ahmadi ◽  
Debaditya Bhowmik ◽  
...  

This paper presents Tiltification, a multimodal spirit level application for smartphones. The non-profit app was produced by students in the master project “Sonification Apps” in the winter term 2020/21 at the University of Bremen. The app uses psychoacoustic sonification to give feedback on the device’s rotation angles in two plane dimensions, allowing users to level furniture or take perfectly horizontal photos. Tiltification supplements the market of spirit level apps with the unique feature of auditory feedback, which offers benefits beyond a physical spirit level and makes the tool more accessible to visually and cognitively impaired people. We argue that distributing sonification apps through mainstream channels helps establish sonification in the market and makes it better known to users outside the scientific domain. We hope that the auditory display community will support us by using and recommending the app and by providing valuable feedback on its functionality and design, as well as on our communication, advertisement, and distribution strategy.
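
As an illustration of the kind of angle-to-sound mapping the abstract describes (not Tiltification's actual psychoacoustic algorithm, which the paper itself details), here is a minimal Python sketch that encodes a tilt angle as the pitch of a sine tone and writes a short demo sweep to a WAV file. The base frequency and scaling constants are arbitrary assumptions.

```python
# Minimal illustrative sketch of tilt sonification: map a rotation
# angle to a tone whose pitch rises with deviation from level.
# BASE_FREQ and FREQ_PER_DEG are hypothetical choices, not the app's.
import numpy as np
import wave

SAMPLE_RATE = 44100
BASE_FREQ = 440.0      # tone at perfect level (assumed)
FREQ_PER_DEG = 20.0    # pitch shift per degree of tilt (assumed)

def tilt_tone(angle_deg: float, duration_s: float = 0.2) -> np.ndarray:
    """Synthesize a sine tone whose frequency encodes the tilt angle."""
    freq = BASE_FREQ + FREQ_PER_DEG * abs(angle_deg)
    t = np.linspace(0.0, duration_s,
                    int(SAMPLE_RATE * duration_s), endpoint=False)
    return 0.5 * np.sin(2.0 * np.pi * freq * t)

# Demo: a sweep from 10 degrees of tilt down to level (0 degrees).
samples = np.concatenate([tilt_tone(a) for a in range(10, -1, -1)])
pcm = (samples * 32767).astype(np.int16)
with wave.open("tilt_sweep.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)   # 16-bit samples
    f.setframerate(SAMPLE_RATE)
    f.writeframes(pcm.tobytes())
```

In a real app the angle would come from the device's orientation sensors and the tone would be synthesized continuously; a psychoacoustic design like Tiltification's additionally shapes timbre and loudness so the direction and magnitude of tilt remain distinguishable by ear.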


2005 ◽  
Vol 43 (2) ◽  
pp. 216-226 ◽  
Author(s):  
Jonathan S. Cant ◽  
David A. Westwood ◽  
Kenneth F. Valyear ◽  
Melvyn A. Goodale

2019 ◽  
Vol 1720 ◽  
pp. 146307 ◽  
Author(s):  
Anastasia M. Bobilev ◽  
Matthew E. Hudgens-Haney ◽  
Jordan P. Hamm ◽  
William T. Oliver ◽  
Jennifer E. McDowell ◽  
...  
