Different mechanisms of magnitude and spatial representation for tactile and auditory modalities

Author(s):  
Alice Bollini ◽  
Davide Esposito ◽  
Claudio Campus ◽  
Monica Gori

Abstract. The human brain builds a representation of the external world from magnitude judgments, estimating distance, numerosity, or size. Magnitude and spatial representation are hypothesized to rely on common mechanisms shared across sensory modalities. We explored the relationship between magnitude and spatial representation using two different sensory systems, hypothesizing that space and magnitude combine differently depending on the sensory modality. Furthermore, we aimed to understand the role of the spatial reference frame in magnitude representation. We used stimulus–response compatibility (SRC) to investigate these processes, assuming that performance improves when stimulus and response share common features. We designed auditory and tactile SRC tasks with conflicting spatial and magnitude mappings. Our results showed that sensory modality modulates the relationship between space and magnitude. In the tactile task, magnitude congruency had a larger effect than spatial congruency, whereas in the auditory task magnitude and space carried similar weight, with neither spatial congruency nor magnitude congruency having a significant effect. Moreover, the spatial frame activated during the tasks was elicited by the sensory inputs: in the tactile task, participants' performance reversed between uncrossed and crossed hand postures, suggesting an internal coordinate system, whereas in the auditory task crossing the hands did not alter performance, suggesting an allocentric frame of reference. Overall, these results suggest that the interaction between space and magnitude differs between the auditory and tactile modalities, supporting the idea that these sensory modalities use different magnitude and spatial representation mechanisms.

2012 ◽  
Vol 25 (0) ◽  
pp. 18
Author(s):  
Achille Pasqualotto

How do people remember the locations of objects? Location is always relative, and thus depends on a reference frame. There are two types of reference frame: egocentric (observer-based) and allocentric (environment-based). Here we investigated the reference frame people use to remember object locations in a large room, and whether the choice of reference frame is dictated by visual experience. We therefore tested congenitally blind, late blind, and sighted blindfolded participants. Objects were organized in a structured configuration and explored one by one, with participants walking back and forth from a single point. After exploring the locations, participants completed a spatial memory test, which required them to imagine standing inside the array of objects, oriented along a given heading, and then to point towards the required object. Crucially, the headings were aligned either to the allocentric structure of the configuration, that is, its rows and columns, or to the egocentric route walked during the exploration of the objects. The spatial representation participants used is revealed by better performance when the imagined heading in the test matches that representation. We found that participants with visual experience, that is, late blind and blindfolded sighted participants, performed better with headings aligned to the allocentric structure of the configuration. On the contrary, congenitally blind participants were more accurate with headings aligned to the egocentrically walked routes. This suggests that visual experience during early development determines a preference for an allocentric frame of reference.


Perception ◽  
10.1068/p2984 ◽  
2000 ◽  
Vol 29 (6) ◽  
pp. 745-754 ◽  
Author(s):  
Gail Martino ◽  
Lawrence E Marks

At each moment, we experience a melange of information arriving at several senses, and often we focus on inputs from one modality and ‘reject’ inputs from another. Does input from a rejected sensory modality modulate one's ability to make decisions about information from a selected one? When the modalities are vision and hearing, the answer is “yes”, suggesting that vision and hearing interact. In the present study, we asked whether similar interactions characterize vision and touch. As with vision and hearing, results obtained in a selective attention task show cross-modal interactions between vision and touch that depend on the synesthetic relationship between the stimulus combinations. These results imply that similar mechanisms may govern cross-modal interactions across sensory modalities.


2020 ◽  
Author(s):  
Anna-Katharina R. Bauer ◽  
Freek van Ede ◽  
Andrew J. Quinn ◽  
Anna C. Nobre

Abstract. At any given moment our sensory systems receive multiple, often rhythmic, inputs from the environment. Processing of temporally structured events in one sensory modality can guide both behavioural and neural processing of events in other sensory modalities, but how this occurs remains unclear. Here, we used human electroencephalography (EEG) to test the cross-modal influences of a continuous auditory frequency-modulated (FM) sound on visual perception and visual cortical activity. We report systematic fluctuations in perceptual discrimination of brief visual stimuli in line with the phase of the FM sound. We further show that this rhythmic modulation in visual perception is related to an accompanying rhythmic modulation of neural activity recorded over visual areas. Importantly, in our task, perceptual and neural visual modulations occurred without any abrupt and salient onsets in the energy of the auditory stimulation and without any rhythmic structure in the visual stimulus. As such, the results provide a critical validation for the existence and functional role of cross-modal entrainment and demonstrate its utility for organising the perception of multisensory stimulation in the natural environment.

Highlights:
- Cross-modal influences are mediated by the synchronisation of neural oscillations
- Visual performance fluctuates in line with the phase of a frequency-modulated sound
- Cross-modal entrainment of neural activity predicts fluctuations in visual performance
- Cross-modal entrainment organises the perception of multisensory stimuli


PARADIGMI ◽  
2009 ◽  
pp. 147-162
Author(s):  
Davide Monopoli ◽  
Cristina Cacciari

The Role of Literal and Figurative Language. Olfaction is still the least investigated of the sensory modalities. This partly reflects the fact that olfaction is the most subjective and emotional sensory modality, and the one with the fewest ties to verbal language. Since metaphors are cognitive bridges between perception and language, in principle they might be especially effective in giving voice to olfaction, the "speechless sense". However, research in this fascinating field is still in its infancy, and the linguistic and psychological results remain scarce and contradictory.

Key Words: Cross-modal interactions, Language, Metaphor, Olfaction, Perception, Synaesthetic metaphors.


2020 ◽  
Author(s):  
Laura Jane Speed ◽  
Marc Brysbaert

Word meaning is thought to be grounded in the sensory modalities. To test such hypotheses experimentally, linguistic stimuli need to be carefully selected and controlled. To aid such investigations, we present a new set of sensory modality norms for over 24,000 Dutch words. The sensory norms comprise perceptual strength ratings in six perceptual modalities: audition, gustation, haptics, olfaction, vision, and interoception. The new norms improve on existing Dutch sensory norms in three ways: 1) they significantly expand the number of words rated; 2) they include multiple word classes; 3) they add a new perceptual modality, interoception. We show that the sensory norms are able to predict word-processing behaviour and outperform existing ratings of sensory experience: concreteness and imageability. The data are available via the Open Science Framework (https://osf.io/ubvy2) and serve as a valuable resource for research into the relationship between language and perception.
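The claim that perceptual strength ratings predict word-processing behaviour can be illustrated with a minimal regression sketch. Everything below is simulated for illustration: the ratings, the response times, and the effect size are assumptions, not values from the actual norms; only the six modality names come from the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
n_words = 1000  # illustrative; the actual norms cover over 24,000 Dutch words

# Hypothetical perceptual strength ratings (0-5) in the six modalities.
modalities = ["audition", "gustation", "haptics",
              "olfaction", "vision", "interoception"]
strength = rng.uniform(0, 5, size=(n_words, len(modalities)))

# Perceptual strength is often summarized as the maximum across modalities;
# simulate lexical-decision times that speed up with stronger grounding.
max_strength = strength.max(axis=1)
rt = 650 - 15 * max_strength + rng.normal(0, 30, n_words)  # ms, simulated

# OLS fit of response time on maximum perceptual strength.
X = np.column_stack([np.ones(n_words), max_strength])
(intercept, slope), *_ = np.linalg.lstsq(X, rt, rcond=None)
print(f"slope = {slope:.1f} ms per rating point")  # should be near the simulated -15
```

A negative slope here simply recovers the simulated speed-up; with real norms the same regression would quantify how much perceptual strength explains processing times beyond concreteness or imageability.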


2020 ◽  
Vol 43 ◽  
Author(s):  
Thomas Parr

Abstract This commentary focuses upon the relationship between two themes in the target article: the ways in which a Markov blanket may be defined and the role of precision and salience in mediating the interactions between what is internal and external to a system. These each rest upon the different perspectives we might take while “choosing” a Markov blanket.


Crisis ◽  
2016 ◽  
Vol 37 (3) ◽  
pp. 212-217 ◽  
Author(s):  
Thomas E. Joiner ◽  
Melanie A. Hom ◽  
Megan L. Rogers ◽  
Carol Chu ◽  
Ian H. Stanley ◽  
...  

Abstract. Background: Lowered eye blink rate may be a clinically useful indicator of acute, imminent, and severe suicide risk. Diminished eye blink rates are often seen among individuals engaged in heightened concentration on a specific task that requires careful planning and attention. Indeed, overcoming one’s biological instinct for survival through suicide necessitates premeditation and concentration; thus, a diminished eye blink rate may signal imminent suicidality. Aims: This article aims to spur research and clinical inquiry into the role of eye blinks as an indicator of acute suicide risk. Method: Literature relevant to the potential connection between eye blink rate and suicidality was reviewed and synthesized. Results: Anecdotal, cognitive, neurological, and conceptual support for the relationship between decreased blink rate and suicide risk is outlined. Conclusion: Given that eye blinks are a highly observable behavior, the potential clinical utility of using eye blink rate as a marker of suicide risk is immense. Research is warranted to explore the association between eye blink rate and acute suicide risk.


2015 ◽  
Vol 36 (3) ◽  
pp. 170-176 ◽  
Author(s):  
Erin N. Stevens ◽  
Joseph R. Bardeen ◽  
Kyle W. Murdock

Parenting behaviors – specifically behaviors characterized by high control, intrusiveness, rejection, and overprotection – and effortful control have each been implicated in the development of anxiety pathology. However, little research has examined the protective role of effortful control in the relation between parenting and anxiety symptoms, specifically among adults. Thus, we sought to explore the unique and interactive effects of parenting and effortful control on anxiety among adults (N = 162). Results suggest that effortful control uniquely contributes to anxiety symptoms above and beyond that of any parenting behavior. Furthermore, effortful control acted as a moderator of the relationship between parental overprotection and anxiety, such that overprotection was associated with anxiety only in individuals with lower levels of effortful control. Implications for potential prevention and intervention efforts that specifically target effortful control are discussed. These findings underscore the importance of considering individual differences in self-regulatory abilities when examining associations between putative early-life risk factors, such as parenting, and anxiety symptoms.
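The moderation result described above (overprotection predicting anxiety only at low effortful control) corresponds statistically to an interaction term in a regression model. The sketch below simulates data with that structure and recovers the interaction; only the sample size of 162 comes from the abstract, and all variables and coefficients are hypothetical, not the study's data or exact analysis.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 162  # sample size from the study; data are simulated for illustration

overprotection = rng.normal(size=n)
effortful_control = rng.normal(size=n)
# Build in the moderation: overprotection raises anxiety mainly
# when effortful control is low (negative interaction).
anxiety = (0.4 * overprotection - 0.5 * effortful_control
           - 0.35 * overprotection * effortful_control
           + rng.normal(size=n))

# Test the moderation by adding an interaction term to an OLS model.
X = np.column_stack([np.ones(n), overprotection, effortful_control,
                     overprotection * effortful_control])
coef, *_ = np.linalg.lstsq(X, anxiety, rcond=None)
print(f"interaction coefficient = {coef[3]:.2f}")  # negative: EC buffers the effect
```

In an applied analysis the interaction coefficient would be tested for significance and probed with simple slopes at low and high levels of effortful control.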


2016 ◽  
Vol 37 (1) ◽  
pp. 31-39 ◽  
Author(s):  
Nicole L. Hofman ◽  
Austin M. Hahn ◽  
Christine K. Tirabassi ◽  
Raluca M. Gaher

Abstract. Exposure to traumatic events and the associated risk of developing posttraumatic stress disorder (PTSD) symptoms is a significant and overlooked concern in the college population. It is important for current research to identify potential protective factors associated with the development and maintenance of PTSD symptoms unique to this population. Emotional intelligence and perceived social support are two identified protective factors that influence the association between exposure to traumatic events and PTSD symptomatology. The current study examined the mediating role of social support in the relationship between emotional intelligence and PTSD symptoms. Participants included 443 trauma-exposed university students who completed online questionnaires. The results of this study indicated that social support mediates the relationship between emotional intelligence and reported PTSD symptoms. Thus, emotional intelligence is significantly associated with PTSD symptoms, and social support may play an integral role in the relationship between emotional intelligence and PTSD. The current study is the first to investigate the role of social support in the relationship between emotional intelligence and PTSD symptoms. These findings have important treatment and prevention implications with regard to PTSD.


2017 ◽  
Vol 16 (3) ◽  
pp. 155-159 ◽  
Author(s):  
Peizhen Sun ◽  
Jennifer J. Chen ◽  
Hongyan Jiang

Abstract. This study investigated the mediating role of coping humor in the relationship between emotional intelligence (EI) and job satisfaction. Participants were 398 primary school teachers in China, who completed the Wong Law Emotional Intelligence Scale, Coping Humor Scale, and Overall Job Satisfaction Scale. Results showed that coping humor was a significant mediator between EI and job satisfaction. A further examination revealed, however, that coping humor only mediated two sub-dimensions of EI (use of emotion and regulation of emotion) and job satisfaction. Implications for future research and limitations of the study are discussed.
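A mediation analysis like the ones in the two abstracts above can be sketched as two regressions plus a bootstrapped indirect effect. This is a common generic approach, not necessarily these authors' exact procedure; all data below are simulated, and only the sample size of 398 is taken from the coping-humor abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 398  # sample size from the study; the data here are simulated

# Simulate a simple mediation structure:
# emotional intelligence -> coping humor -> job satisfaction.
ei = rng.normal(size=n)
humor = 0.5 * ei + rng.normal(size=n)                        # a-path
satisfaction = 0.4 * humor + 0.2 * ei + rng.normal(size=n)   # b-path + direct effect

def ols_slopes(x, y):
    """Slope coefficients from an OLS regression of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(y), *np.atleast_2d(x)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[1:]

a = ols_slopes(ei, humor)[0]                  # EI -> humor
b = ols_slopes([humor, ei], satisfaction)[0]  # humor -> satisfaction, controlling EI
indirect = a * b                              # indirect (mediated) effect

# Percentile-bootstrap confidence interval for the indirect effect.
boots = []
for _ in range(2000):
    i = rng.integers(0, n, n)
    a_s = ols_slopes(ei[i], humor[i])[0]
    b_s = ols_slopes([humor[i], ei[i]], satisfaction[i])[0]
    boots.append(a_s * b_s)
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"indirect effect = {indirect:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

A bootstrap interval that excludes zero is the usual evidence for mediation; partial mediation (as reported for two EI sub-dimensions) would show a nonzero direct effect alongside the indirect one.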

