audiovisual interaction
Recently Published Documents

TOTAL DOCUMENTS: 20 (FIVE YEARS: 3)
H-INDEX: 4 (FIVE YEARS: 0)

Author(s): Hui Xie, Yi He, Xueying Wu, Yi Lu

Historic districts play a vital role in stimulating urban economic development, conserving regional culture, and enhancing public participation. Both the auditory and visual environments, and the interplay between them, are critical to visitors’ perception and evaluation of historic districts. However, most studies have explored the auditory and visual environments separately, and the handful of existing studies on audiovisual interaction were confined to laboratory settings, limiting their external validity. Here, we performed a data-driven study of the features of the auditory and visual environments, and the interaction between them, in 17 historic towns in China, using posts containing soundscape-related keywords and streetscape photos from a popular Chinese social media platform. First, we found that the auditory environments of historic districts mainly consist of man-made sounds from folkloric activities and street shop vendors, and natural sounds from running water and birds. Second, street greenery, spatial enclosure, and the presence of pedestrians in the visual environment are positively associated with emotional feedback on the soundscape. Together with earlier work, these results support the importance of studying the auditory and visual environments of historic districts in conjunction, and the methods introduced here can be applied in further studies in the field.
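The keyword-filtering step this study describes can be sketched in a few lines. The following is a minimal, hypothetical Python example: the categories, keyword lists, and sample posts are illustrative assumptions for the sketch, not the study's actual lexicon or data.

```python
# Hypothetical sketch: tag social-media posts with the sound-source
# categories the study reports (folkloric activity, street vendors,
# running water, birds) via simple keyword matching.
from collections import Counter

SOUND_KEYWORDS = {  # illustrative keyword lists, not the study's lexicon
    "folklore": ["drum", "opera", "festival"],
    "vendors": ["vendor", "hawker", "market"],
    "water": ["stream", "fountain", "river"],
    "birds": ["birdsong", "chirp", "sparrow"],
}

def classify_posts(posts):
    """Count how many posts mention each sound-source category."""
    counts = Counter()
    for text in posts:
        lowered = text.lower()
        for category, words in SOUND_KEYWORDS.items():
            # Count each category at most once per post.
            if any(w in lowered for w in words):
                counts[category] += 1
    return counts

posts = [
    "Heard drums from a temple festival in the old town",
    "A hawker calling out near the canal, with a stream running by",
    "Sparrows chirping over the tiled roofs",
]
print(classify_posts(posts))
```

In the actual study this filtering would run over a large platform corpus and be paired with streetscape-photo features; the sketch only shows the keyword-tallying idea.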


2021, pp. 120-129
Author(s): M. Omar Chohan, Martina Stippler, Susy Salvo Wendt, Howard Yonas

Teleneurosurgery can play a vital role in the care of patients in hospitals and community health settings where neurosurgical expertise is not available. Audiovisual interaction between a neurosurgeon and the emergency physician, the patient, and the patient’s family, combined with an intensive education program delivered to the originating-site care team, has greatly enhanced the appropriate triage of patients in community hospitals. The result is better patient care, improved patient and family satisfaction, cost savings, and the retention of patients within the local community care system, as well as improved sustainability of the wider health delivery system. To succeed, such programs often need start-up financial support to provide the required technical elements and 24/7 neurosurgical availability.


2021, Vol 11 (16), pp. 7546
Author(s): Katashi Nagao, Kaho Kumon, Kodai Hattori

In building-scale VR, where the entire interior of a large-scale building is a virtual space that users can walk around in, it is very important to handle movable objects that exist in the real world but not in the virtual space. We propose a mechanism that detects such objects (ones not embedded in the virtual space) in advance and then generates a sound when one is hit with a virtual stick. Moreover, a large indoor virtual environment may contain multiple users at the same time, whose presence can be perceived by hearing as well as by sight, e.g., through sounds such as footsteps. We therefore use a GAN-based deep learning system to generate the impact sound of any object. First, to display a real-world object visually in the virtual space, its 3D data is captured with an RGB-D camera and saved along with its position information. At the same time, we take an image of the object, segment it into parts, estimate the material of each part, generate the corresponding impact sound, and associate that sound with the part. When a VR user hits the object virtually (e.g., with a virtual stick), the associated sound is played. We demonstrate that users can judge the material from the sound, confirming the effectiveness of the proposed method.
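The core idea of mapping an estimated material to an impact sound can be illustrated without the paper's GAN. In the sketch below, a simple exponentially decaying sine stands in for the generated waveform, and the per-material frequencies and decay rates are invented for illustration, not values from the paper.

```python
# Simplified stand-in for the GAN pipeline: look up an estimated material
# label and synthesize a toy impact sound (decaying sine) for it.
import math

MATERIAL_PARAMS = {  # (fundamental frequency Hz, decay rate 1/s); invented values
    "wood": (400.0, 8.0),
    "metal": (1200.0, 2.0),
    "plastic": (700.0, 12.0),
}

def impact_sound(material, duration=0.5, rate=16000):
    """Return a list of samples: an exponentially decaying sine for the material."""
    freq, decay = MATERIAL_PARAMS[material]
    n = int(duration * rate)
    return [
        math.exp(-decay * t / rate) * math.sin(2 * math.pi * freq * t / rate)
        for t in range(n)
    ]

wave = impact_sound("metal")  # 0.5 s of samples at 16 kHz
```

A real system would replace `impact_sound` with the learned generator and attach the waveform to the segmented object part; the lookup-then-synthesize structure is the same.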


2018, pp. 435-463
Author(s): Dominik Strohmeier, Satu Jumisko-Pyykkö

Perception, 2018, Vol 47 (7), pp. 751-771
Author(s): Daiki Yamasaki, Kiyofumi Miyoshi, Christian F. Altmann, Hiroshi Ashida

Despite accumulating evidence for the spatial rule, by which cross-modal interaction depends on the spatial consistency of stimuli, it is still unclear whether 3D spatial consistency (i.e., the front/rear position of stimuli relative to the body) also regulates audiovisual interaction. We investigated how sounds with increasing or decreasing intensity (looming or receding sounds), presented from the front or rear space of the body, affect the perceived size of a dynamic visual object. Participants performed a size-matching task (Experiments 1 and 2) and a size-adjustment task (Experiment 3) on visual stimuli with increasing or decreasing diameter while being exposed to a front- or rear-presented sound with increasing or decreasing intensity. Across these experiments, only the front-presented looming sound caused overestimation of the size of the spatially consistent looming visual stimulus, not of the spatially inconsistent or receding visual stimuli. The receding sound had no significant effect on vision. Our results reveal that a looming sound alters dynamic visual size perception depending on the consistency of the approaching quality and the front-rear spatial location of the audiovisual stimuli, suggesting that the human brain processes audiovisual inputs differently based on their 3D spatial consistency. This selective interaction between looming signals should contribute to faster detection of approaching threats. Our findings extend the spatial rule governing audiovisual interaction into 3D space.
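The looming/receding manipulation amounts to a rising or falling intensity envelope on the sound. A minimal sketch follows; the dB range and number of steps are illustrative assumptions, not the paper's stimulus parameters.

```python
# Toy stimulus envelope: a looming sound rises in intensity, a receding
# sound falls. Levels are in dB; the 50-70 dB range is an assumption.
def intensity_ramp(kind, start_db=50.0, end_db=70.0, steps=5):
    """Return a list of intensity levels; 'looming' rises, 'receding' falls."""
    step = (end_db - start_db) / (steps - 1)
    levels = [start_db + i * step for i in range(steps)]
    return levels if kind == "looming" else levels[::-1]

print(intensity_ramp("looming"))   # rising envelope (approaching source)
print(intensity_ramp("receding"))  # falling envelope (retreating source)
```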


i-Perception, 2015, Vol 6 (4), pp. 204166951559933
Author(s): Hannah Goldberg, Yile Sun, Timothy J. Hickey, Barbara Shinn-Cunningham, Robert Sekuler

2015, Vol 112 (27), pp. 8493-8498
Author(s): Minyoung Lee, Randolph Blake, Sujin Kim, Chai-Youn Kim

Predictive influences of auditory information on resolution of visual competition were investigated using music, whose visual symbolic notation is familiar only to those with musical training. Results from two experiments using different experimental paradigms revealed that melodic congruence between what is seen and what is heard impacts perceptual dynamics during binocular rivalry. This bisensory interaction was observed only when the musical score was perceptually dominant, not when it was suppressed from awareness, and it was observed only in people who could read music. Results from two ancillary experiments showed that this effect of congruence cannot be explained by differential patterns of eye movements or by differential response sluggishness associated with congruent score/melody combinations. Taken together, these results demonstrate robust audiovisual interaction based on high-level, symbolic representations and its predictive influence on perceptual dynamics during binocular rivalry.


Author(s): Weiping Yang, Yulin Gao, Jinglong Wu

In everyday life, visual and auditory signals are the most common forms of sensory information, so audiovisual interaction in the brain plays an important role in perception and performance. In addition, our attention system allows us to dynamically select and enhance the processing of the objects and events that are most relevant at each moment. Some studies suggest that attention can modulate audiovisual integration; however, the neural activity underlying multimodal audiovisual integration differs across attention conditions. This review focuses on what affects selective and divided attention in audiovisual interaction, and discusses the neural activity of audiovisual integration under selective and divided attention conditions. It aims to bring together and summarize previous studies on the interactions between attention and audiovisual integration.


i-Perception, 10.1068/ic833, 2011, Vol 2 (8), pp. 833-833
Author(s): Kuan-Ming Chen, Su-Ling Yeh
