Visual and Kinesthetic Control of Goal-Directed Movements to Visually and Kinesthetically Presented Targets

1998 ◽  
Vol 86 (3_suppl) ◽  
pp. 1375-1391 ◽  
Author(s):  
Y. Laufer ◽  
S. Hocherman

The study investigated the contribution of kinesthetic and visual input to the performance of reaching movements and identified rules governing the transformation of information between these two sensory modalities. The study examined the accuracy with which 39 subjects reproduced the locations of five targets in a horizontal plane. The mode of target presentation and the feedback during reproduction of a target's location were either visual, kinesthetic, or a combination of both modalities. Thus, it was possible to examine performance when target presentation and reproduction involved feedback from the same sensory modality (intramodal) as well as from different sensory modalities (intermodal). Errors in target reproduction were calculated in terms of distance and systematic biases in movement extent. The major findings of the study are: (1) Intramodal reproduction of a target's location on the basis of kinesthetic feedback is somewhat less accurate than intramodal reproduction on the basis of visual feedback. (2) Intermodal performance is significantly less accurate than intramodal performance. (3) Accuracy of performance does not depend on the direction of information transfer between sensory modalities. (4) Intermodal performance is characterized by systematic biases in movement extent which depend on the direction of information transfer between modalities. (5) When presentation of the target's location is bimodal, reproduction is adversely affected by the conflicting input. The results suggest that the transformation rules used to combine input from various sensory modalities depend on environmental conditions and attention.
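
A minimal sketch of the two error measures described above (absolute distance error and a signed bias in movement extent), assuming 2-D coordinates in the horizontal plane; the function name, start position, and example values are hypothetical:

```python
import numpy as np

def reproduction_errors(target_xy, reproduced_xy, origin_xy=(0.0, 0.0)):
    """Distance error and signed extent bias for one reproduction trial.

    target_xy, reproduced_xy : (x, y) positions in the horizontal plane.
    origin_xy : assumed start position of the reaching movement.
    """
    target = np.asarray(target_xy, dtype=float)
    reproduced = np.asarray(reproduced_xy, dtype=float)
    origin = np.asarray(origin_xy, dtype=float)

    # Absolute distance between the reproduced and presented target locations.
    distance_error = np.linalg.norm(reproduced - target)

    # Signed bias in movement extent: positive = overshoot, negative = undershoot.
    extent_bias = np.linalg.norm(reproduced - origin) - np.linalg.norm(target - origin)
    return distance_error, extent_bias

# Hypothetical trial: target 20 cm away, reproduction slightly short and off to the side.
d_err, ext_bias = reproduction_errors((0.0, 20.0), (1.0, 18.5))
print(f"distance error = {d_err:.2f} cm, extent bias = {ext_bias:.2f} cm")
```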

2020 ◽  
pp. 3-14
Author(s):  
O. M. Samoylenko ◽  
O. V. Adamenko ◽  
B. P. Kukareka

A reference method for the simultaneous calibration of three or more measurement standards for vertical angle measurement is developed. The method yields, for each measuring standard, the systematic bias of its vertical angle measurements relative to the horizontal plane, averaged from the measurement results obtained during calibration or comparison. To realize the reference method, an autocollimation electronic measurement standard for automated measurement of vertical angles, the SeaLineZero_Standard™ (SLZ_S™), was developed. The combined standard deviation (k = 1) of vertical angle measurement relative to the horizontal plane, derived from the calibration results obtained with the reference method, does not exceed 0.07ʺ to 0.15ʺ. This result was obtained without applying the systematic biases of the individual measurement standards as corrections (with opposite sign). The measuring standards that were developed and investigated are needed to determine the systematic biases of vertical angle measurements for total stations and theodolites with normed standard errors of 0.5ʺ and 1ʺ when these instruments are calibrated.
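
As an illustration only, here is a minimal sketch of one way per-standard systematic biases could be averaged from simultaneous readings; this is an assumed scheme, not the authors' actual reference method, and the readings and names are hypothetical:

```python
import numpy as np

# Hypothetical simultaneous readings (arcseconds) of the same set of vertical
# angles by three measurement standards; rows = standards, columns = angle settings.
readings = np.array([
    [100.12, 250.08, 399.95, 550.03],   # standard A
    [100.05, 250.01, 399.90, 549.97],   # standard B
    [100.10, 250.11, 400.02, 550.06],   # standard C
])

# Take the cross-standard mean of each angle as the reference value,
# then average each standard's deviation from it over all angle settings.
reference = readings.mean(axis=0)
systematic_bias = (readings - reference).mean(axis=1)

# Spread of the residuals after removing the per-standard bias (k = 1).
residuals = readings - reference - systematic_bias[:, None]
print("per-standard bias (arcsec):", np.round(systematic_bias, 3))
print("residual standard deviation (arcsec):", round(residuals.std(ddof=1), 3))
```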


2021 ◽  
pp. 214-220
Author(s):  
Wei Lin Toh ◽  
Neil Thomas ◽  
Susan L. Rossell

There has been burgeoning interest in studying hallucinations in psychosis occurring across multiple sensory modalities. The current study aimed to characterize the auditory hallucination and delusion profiles in patients with auditory hallucinations only versus those with multisensory hallucinations. Participants with psychosis were partitioned into groups with voices only (AVH; n = 50) versus voices plus hallucinations in at least one other sensory modality (AVH+; n = 50), based on their responses on the Scale for the Assessment of Positive Symptoms (SAPS). Basic demographic and clinical information was collected, and the Questionnaire for Psychotic Experiences (QPE) was used to assess psychosis phenomenology. Relative to the AVH group, the AVH+ group showed significantly elevated compliance with perceived commands, auditory illusions, and sensed presences. The latter group also had greater levels of delusion-related distress and functional impairment and was more likely to endorse delusions of reference and misidentification. This preliminary study uncovered important phenomenological differences in those with multisensory hallucinations. Future hallucination research extending beyond the auditory modality is needed.


2016 ◽  
Vol 14 (3) ◽  
pp. 21-31 ◽  
Author(s):  
O.B. Bogdashina

Synaesthesia is a perceptual phenomenon in which stimulation of one sensory modality triggers a perception in one or more other sensory modalities. Synaesthesia is not uniform and can manifest itself in different ways. Because the sensations and their interpretation vary across different periods of time, the phenomenon is difficult to study. The article presents a classification of different forms of synaesthesia, including sensory and cognitive, as well as bimodal and multimodal synaesthesia. Some synaesthetes have several forms and variants of synaesthesia, while others have just one. Although synaesthesia is not specific to autism spectrum disorders, it is quite common among autistic individuals. The article deals with the most common forms of synaesthesia in autism and the advantages and problems of synaesthetic perception in children with autism spectrum disorders, and provides advice to parents on how to recognise synaesthesia in children with autism.


Author(s):  
Drew McRacken ◽  
Maddie Dyson ◽  
Kevin Hu

Over the past few decades, a significant number of reports have suggested that reaction times differ between sensory modalities (e.g., that visual reaction time is slower than tactile reaction time). A recent report by Holden and colleagues stated that (1) there has been a significant historic upward drift in reaction times reported in the literature, (2) this drift or degradation in reaction times could be accounted for by inaccuracies in the methods used, and (3) these inaccurate methods led to inaccurate reporting of differences between visual and tactile reaction time testing. The Holden study utilized robotics (i.e., no human factors) to test visual and tactile reaction time methods but did not assess how individuals would perform on different sensory modalities. This study utilized three different sensory modalities (visual, auditory, and tactile) to test reaction time. By changing the way in which the subjects were prompted and measuring subsequent reaction time, the impact of sensory modality could be analyzed. Reaction time testing for two sensory modalities, auditory and visual, was administered through an Arduino Uno microcontroller device, while tactile-based reaction time testing was administered with the Brain Gauge. A range of stimulus intensities was delivered for each sensory modality. Average reaction time and reaction time variability were assessed, and a trend could be identified for the reaction time measurements of each sensory modality. Switching the sensory modality did not result in a difference in reaction time, which was attributed to the accurate circuitry used to deliver each test. Increasing the stimulus intensity for each sensory modality resulted in faster reaction times. The results of this study confirm the findings of Holden and colleagues and contradict the results reported in countless studies concluding that (1) reaction times are slower now than they were 50 years ago and (2) there are differences in reaction times across sensory modalities (vision, hearing, touch). The implication is that the use of accurate reaction time methods could have a significant impact on clinical outcomes, and that many methods in current clinical use perpetuate poor methodology and waste the time and money of countless subjects or patients.
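
A minimal sketch of the per-modality summary analysis described above (mean reaction time and variability at each stimulus intensity); the trial data, intensity levels, and values are hypothetical:

```python
import statistics
from collections import defaultdict

# Hypothetical reaction time trials: (modality, stimulus intensity level, reaction time in ms).
trials = [
    ("visual", "low", 245), ("visual", "high", 228), ("visual", "low", 252), ("visual", "high", 231),
    ("auditory", "low", 243), ("auditory", "high", 226), ("auditory", "low", 249), ("auditory", "high", 230),
    ("tactile", "low", 247), ("tactile", "high", 229), ("tactile", "low", 250), ("tactile", "high", 233),
]

grouped = defaultdict(list)
for modality, intensity, rt in trials:
    grouped[(modality, intensity)].append(rt)

# Mean reaction time and variability (standard deviation) per modality and intensity.
for (modality, intensity), rts in sorted(grouped.items()):
    print(f"{modality:8s} {intensity:4s}  mean = {statistics.mean(rts):6.1f} ms  "
          f"sd = {statistics.stdev(rts):5.1f} ms")
```

With real data, the pattern reported in the study would appear as similar means across modalities and faster reaction times at higher stimulus intensities.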


Author(s):  
Marcella Montagnese ◽  
Pantelis Leptourgos ◽  
Charles Fernyhough ◽  
Flavie Waters ◽  
Frank Larøi ◽  
...  

Hallucinations can occur in different sensory modalities, both simultaneously and serially in time. They have typically been studied in clinical populations as phenomena occurring in a single sensory modality. Hallucinatory experiences occurring in multiple sensory systems—multimodal hallucinations (MMHs)—are more prevalent than previously thought and may have greater adverse impact than unimodal ones, but they remain relatively underresearched. Here, we review and discuss: (1) the definition and categorization of both serial and simultaneous MMHs, (2) available assessment tools and how they can be improved, and (3) the explanatory power that current hallucination theories have for MMHs. Overall, we suggest that current models need to be updated or developed to account for MMHs and to inform research into the underlying processes of such hallucinatory phenomena. We make recommendations for future research and for clinical practice, including the need for service user involvement and for better assessment tools that can reliably measure MMHs and distinguish them from other related phenomena.


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Lucilla Cardinali ◽  
Andrea Serino ◽  
Monica Gori

Cortical body size representations are distorted in the adult, from low-level motor and sensory maps to higher-level multisensory and cognitive representations. Little is known about how such representations are built and evolve during infancy and childhood. Here we investigated how hand size is represented in typically developing children aged 6 to 10. Participants were asked to estimate their hand size using two different sensory modalities (visual or haptic). We found a distortion (underestimation) already present in the youngest children. Crucially, this distortion increases with age, regardless of the sensory modality used to access the representation. Finally, the underestimation is specific to the body, as no bias was found for object estimation. This study suggests that the brain does not keep up with natural body growth. However, since neither motor behavior nor perception was impaired, the distortion seems to be functional and/or compensated for, allowing proper interaction with the external environment.


1962 ◽  
Vol 203 (5) ◽  
pp. 799-802 ◽  
Author(s):  
S. T. Kitai ◽  
F. Morin

The dorsal spinocerebellar tract (DSCT) at C-1, C-2, and the lower medulla level was studied with microelectrodes in lightly anesthetized cats. All responses were obtained from stimulation of the ipsilateral side of the body. The sensory modalities activating the total of 242 fibers studied were touch (53%), pressure (31%), touch and pressure (2%), and joint movement (14%). Responses to touch were more numerous for the forelimb, while responses to pressure and to joint movement were more numerous for the hind limb. Regardless of modality, the trunk was significantly less represented in the DSCT than the limbs. Tactile and pressure peripheral receptive fields were either restricted (e.g., a few hairs of a paw) or large (e.g., more than one segment of a limb). The ratio of restricted to large fields for touch was 7 to 1, and for pressure 5 to 1. Fibers activated by joint movements adjusted their frequency of firing to the degree of displacement and to the rate of the movement. There was no evidence for a separate anatomical segregation of fibers responding to a single sensory modality.


2020 ◽  
Vol 287 (1928) ◽  
pp. 20200944 ◽  
Author(s):  
Nicholas M. Michalak ◽  
Oliver Sng ◽  
Iris M. Wang ◽  
Joshua Ackerman

Cough, cough. Is that person sick, or do they just have a throat tickle? A growing body of research suggests pathogen threats shape key aspects of human sociality. However, less research has investigated specific processes involved in pathogen threat detection. Here, we examine whether perceivers can accurately detect pathogen threats using an understudied sensory modality—sound. Participants in four studies judged whether cough and sneeze sounds were produced by people infected with a communicable disease or not. We found no evidence that participants could accurately identify the origins of these sounds. Instead, the more disgusting they perceived a sound to be, the more likely they were to judge that it came from an infected person (regardless of whether it did). Thus, unlike research indicating perceivers can accurately diagnose infection using other sensory modalities (e.g. sight, smell), we find people overperceive pathogen threat in subjectively disgusting sounds.


2014 ◽  
Vol 112 (9) ◽  
pp. 2290-2301 ◽  
Author(s):  
Jean Blouin ◽  
Anahid H. Saradjian ◽  
Nicolas Lebar ◽  
Alain Guillaume ◽  
Laurence Mouchnino

Behavioral studies have suggested that the brain uses a visual estimate of the hand to plan reaching movements toward visual targets and somatosensory inputs in the case of somatosensory targets. However, neural correlates for distinct coding of the hand according to the sensory modality of the target have not yet been identified. Here we tested the twofold hypothesis that the somatosensory input from the reaching hand is facilitated and inhibited, respectively, when planning movements toward somatosensory (unseen fingers) or visual targets. The weight of the somatosensory inputs was assessed by measuring the amplitude of the somatosensory evoked potential (SEP) resulting from vibration of the reaching finger during movement planning. The target sensory modality had no significant effect on SEP amplitude. However, Spearman's analyses showed significant correlations between the SEPs and reaching errors. When planning movements toward proprioceptive targets without visual feedback of the reaching hand, participants showing the greater SEPs were those who produced the smaller directional errors. Inversely, participants showing the smaller SEPs when planning movements toward visual targets with visual feedback of the reaching hand were those who produced the smaller directional errors. No significant correlation was found between the SEPs and radial or amplitude errors. Our results indicate that the sensory strategy for planning movements is highly flexible among individuals and also for a given sensory context. Most importantly, they provide neural bases for the suggestion that optimization of movement planning requires the target and the reaching hand to both be represented in the same sensory modality.
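
A minimal sketch of the Spearman correlation analysis mentioned above (SEP amplitude versus directional reaching error across participants), using scipy; the per-participant values are hypothetical:

```python
from scipy.stats import spearmanr

# Hypothetical per-participant values: SEP amplitude (microvolts) during movement
# planning and absolute directional reaching error (degrees).
sep_amplitude = [1.8, 2.4, 3.1, 2.0, 3.6, 2.9, 1.5, 3.3]
directional_error = [6.2, 5.1, 3.4, 5.8, 2.9, 3.8, 7.0, 3.1]

# A negative rho would mirror the pattern reported for proprioceptive targets:
# participants with larger SEPs produced smaller directional errors.
rho, p_value = spearmanr(sep_amplitude, directional_error)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```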


Behaviour ◽  
2013 ◽  
Vol 150 (12) ◽  
pp. 1467-1489 ◽  
Author(s):  
Arielle Duhaime-Ross ◽  
Geneviève Martel ◽  
Frédéric Laberge

Many animals use and react to multimodal signals — signals that occur in more than one sensory modality. This study focused on the respective roles of vision, chemoreception, and their possible interaction in determining agonistic responses of the red-backed salamander, Plethodon cinereus. The use of a computer display allowed separate or combined presentation of visual and chemical cues. A cue isolation experiment using adult male and juvenile salamanders showed that both visual and chemical cues from unfamiliar male conspecifics could increase aggressive displays. Submissive displays were increased only in juveniles, and specifically by the visual cue. The rate of chemoinvestigation of the substrate was increased only by chemical cues in adults, whereas both chemical and visual cues increased this behaviour in juveniles. Chemoinvestigation thus appears to be more dependent on sensory input in juvenile salamanders. A follow-up experiment comparing responses to visual cues of different animals (conspecific salamander, heterospecific salamander and earthworm) or an inanimate object (wood stick) showed that exploratory behaviour was higher in the presence of the inanimate object stimulus. The heterospecific salamander stimulus produced strong submissive and escape responses, while the conspecific salamander stimulus promoted aggressive displays. Finally, the earthworm stimulus increased both aggressive and submissive behaviours at intermediate levels when compared to salamander cues. These specific combinations of agonistic and exploratory responses to each stimulus suggest that salamanders could discriminate the cues visually. This study sheds some light on how information from different sensory modalities guides social behaviour at different life stages in a salamander.

