self motion
Recently Published Documents


TOTAL DOCUMENTS: 1244 (five years: 279)

H-INDEX: 57 (five years: 7)

2022 ◽  
Vol 13 (1) ◽  
Author(s):  
Isabelle Mackrous ◽  
Jérome Carriot ◽  
Kathleen E. Cullen

Abstract The vestibular system detects head motion to coordinate vital reflexes and provide our sense of balance and spatial orientation. A long-standing hypothesis has been that projections from the central vestibular system back to the vestibular sensory organs (i.e., the efferent vestibular system) mediate adaptive sensory coding during voluntary locomotion. However, direct proof for this idea has been lacking. Here we recorded from individual semicircular canal and otolith afferents during walking and running in monkeys. Using a combination of mathematical modeling and nonlinear analysis, we show that afferent encoding is identical across passive and active conditions, irrespective of context. Taken together, our results are instead consistent with the view that the vestibular periphery relays robust information to the brain during primate locomotion, suggesting that context-dependent modulation occurs centrally to ensure that coding is consistent with behavioral goals.


2022 ◽  
pp. 1-29
Author(s):  
Andrew R. Wagner ◽  
Megan J. Kobel ◽  
Daniel M. Merfeld

Abstract In an effort to characterize the factors influencing the perception of self-motion rotational cues, vestibular self-motion perceptual thresholds were measured in 14 subjects for rotations in the roll and pitch planes, as well as in the planes aligned with the anatomic orientation of the vertical semicircular canals (i.e., left anterior-right posterior, LARP, and right anterior-left posterior, RALP). To determine the multisensory influence of concurrent otolith cues, within each plane of motion, thresholds were measured at four discrete frequencies for rotations about earth-horizontal (i.e., tilts; EH) and earth-vertical axes (i.e., head positioned in the plane of the rotation; EV). We found that the perception of rotations stimulating primarily the vertical canals was consistent with the behavior of a high-pass filter for all planes of motion, with velocity thresholds increasing at lower frequencies of rotation. In contrast, tilt (i.e., EH rotation) velocity thresholds, stimulating both the canals and otoliths (i.e., multisensory integration), decreased at lower frequencies and were significantly lower than earth-vertical rotation thresholds at each frequency below 2 Hz. These data suggest that multisensory integration of otolithic gravity cues with semicircular canal rotation cues enhances perceptual precision for tilt motions at frequencies below 2 Hz. We also showed that rotation thresholds were at least partially dependent on the orientation of the rotation plane relative to the anatomical alignment of the vertical canals. Collectively, these data provide the first comprehensive report of how frequency and axis of rotation influence perception of rotational self-motion cues stimulating the vertical canals.
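The frequency dependence described above can be illustrated with a toy fit. The Python sketch below is not taken from the paper: it fits a first-order high-pass-filter threshold model to hypothetical velocity thresholds, and the model form, frequencies, and numbers are all assumptions used for illustration.

```python
# Toy fit of a first-order high-pass-filter threshold model (illustrative only).
# Assumes T(f) = T0 * sqrt(1 + (fc / f)**2): thresholds rise at low frequency.
import numpy as np
from scipy.optimize import curve_fit

def hpf_threshold(f, t0, fc):
    """Velocity threshold predicted by a first-order high-pass-filter model."""
    return t0 * np.sqrt(1.0 + (fc / f) ** 2)

# Hypothetical velocity thresholds (deg/s) at four rotation frequencies (Hz).
freqs = np.array([0.1, 0.5, 1.0, 2.0])
thresholds = np.array([4.8, 1.6, 1.1, 0.9])

(t0, fc), _ = curve_fit(hpf_threshold, freqs, thresholds, p0=(1.0, 0.3))
print(f"asymptotic threshold T0 = {t0:.2f} deg/s, corner frequency fc = {fc:.2f} Hz")
```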


Cortex ◽  
2022 ◽  
Author(s):  
Shir Shalom-Sperber ◽  
Aihua Chen ◽  
Adam Zaidel

Conatus ◽  
2021 ◽  
Vol 6 (2) ◽  
pp. 87
Author(s):  
Justin Humphreys

Descartes holds that, insofar as nature is a purposeless, unthinking, extended substance, there could be no final causes in physics. Descartes’ derivation of his three laws of motion from the perfections of God thus underwrites a rejection of Aristotle’s conception of natural self-motion and teleology. Aristotle derived his conception of the purposeful action of sublunar creatures from his notion that superlunar bodies are perfect, eternal, living beings, via the thesis that circular motion is more complete or perfect than rectilinear motion. Descartes’ reduction of circular motion to rectilinear motion, achieved through his theological foundation of the laws of motion, thus marks a crucial break from Aristotle’s philosophy of nature. This paper argues that the shift from the Aristotelian conception of nature as self-moving and teleological to the Cartesian conception of nature as purposeless and inert is not an empirical discovery but is rooted in differing conceptions of where perfection lies in nature.


2021 ◽  
Author(s):  
Chinmay Purandare ◽  
Shonali Dhingra ◽  
Rodrigo Rios ◽  
Cliff Vuong ◽  
Thuc To ◽  
...  

Visual cortical neurons encode the position and motion direction of specific stimuli retrospectively, without any locomotion or task demand. The hippocampus, a part of the visual system, is hypothesized to require self-motion or a cognitive task to generate allocentric spatial selectivity that is scalar, abstract, and prospective. To bridge these seeming disparities, we measured rodent hippocampal selectivity to a moving bar of light in a body-fixed rat. About 70% of dorsal CA1 neurons showed stable activity modulation as a function of the bar's angular position, independent of behavior and rewards. A third of the tuned cells also encoded the direction of revolution. In other experiments, neurons encoded the distance of the bar, with a preference for approaching motion. Collectively, these responses demonstrate visually evoked vectorial selectivity (VEVS). Unlike place cells, VEVS was retrospective. Changes in the visual stimulus or its trajectory did not cause remapping but only gradual changes. Most VEVS-tuned neurons behaved like place cells during spatial exploration, and the two selectivities were correlated. Thus, VEVS could form the basic building block of hippocampal activity. When combined with self-motion, reward, or multisensory stimuli, it can generate the complexity of prospective representations, including allocentric space, time, and episodes.
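As an illustration of how such angular tuning can be quantified, the following sketch (not the authors' pipeline) computes an occupancy-normalized firing-rate curve over the bar's angular position; the function name, bin count, and data layout are assumptions.

```python
# Occupancy-normalized tuning to bar angle (illustrative, hypothetical data layout).
import numpy as np

def angular_tuning(spike_times, t, bar_angle_deg, n_bins=36):
    """Firing rate (Hz) versus bar angle: spike counts divided by time spent per bin."""
    bins = np.linspace(0.0, 360.0, n_bins + 1)
    dt = np.median(np.diff(t))                       # sampling interval (s)
    occupancy, _ = np.histogram(bar_angle_deg, bins=bins)
    occupancy_s = occupancy * dt                     # seconds spent in each angle bin
    # Bar angle at each spike, via nearest-sample lookup into the angle trace.
    idx = np.searchsorted(t, spike_times).clip(0, len(t) - 1)
    spike_counts, _ = np.histogram(bar_angle_deg[idx], bins=bins)
    with np.errstate(divide="ignore", invalid="ignore"):
        rate = np.where(occupancy_s > 0, spike_counts / occupancy_s, np.nan)
    centers = 0.5 * (bins[:-1] + bins[1:])
    return centers, rate
```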


2021 ◽  
Vol 12 (1) ◽  
pp. 173
Author(s):  
Akio Honda ◽  
Kei Maeda ◽  
Shuichi Sakamoto ◽  
Yôiti Suzuki

The deterioration of sound localization accuracy during a listener’s head/body rotation is independent of the listener’s rotation velocity (Honda et al., 2016). However, whether this deterioration occurs only during physical movement in a real environment remains unclear. In this study, we addressed this question by subjecting physically stationary listeners to visually induced self-motion, i.e., vection. Two conditions were adopted: one with a visually induced perception of self-motion (vection) and one without it (control). Under both conditions, a short noise burst (30 ms) was presented via a loudspeaker in a circular array placed horizontally in front of the listener. The listeners were asked to judge the position of the acoustic stimulus relative to their subjective midline. The results showed that, in terms of detection thresholds based on the subjective midline, sound localization accuracy was lower under the vection condition than under the control condition. This indicates that sound localization can be compromised by visually induced self-motion perception. These findings support the idea that self-motion information is crucial for auditory space perception and could enable the design of dynamic binaural displays that require fewer computational resources.
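A common way to estimate a subjective midline from left/right judgments is to fit a psychometric function. The sketch below is a hypothetical example along those lines, not the study's analysis; the azimuths, response proportions, and the cumulative-Gaussian model are assumptions.

```python
# Cumulative-Gaussian psychometric fit for left/right judgments (hypothetical data).
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(azimuth_deg, pse, sigma):
    """Probability of reporting the sound to the right of the subjective midline."""
    return norm.cdf(azimuth_deg, loc=pse, scale=sigma)

azimuths = np.array([-12.0, -8.0, -4.0, 0.0, 4.0, 8.0, 12.0])   # speaker azimuth (deg)
p_right = np.array([0.05, 0.15, 0.35, 0.55, 0.75, 0.90, 0.97])  # proportion "right"

(pse, sigma), _ = curve_fit(psychometric, azimuths, p_right, p0=(0.0, 5.0))
print(f"subjective midline (PSE) = {pse:.1f} deg, threshold (sigma) = {sigma:.1f} deg")
```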


2021 ◽  
pp. 1-15
Author(s):  
Junchen Wang ◽  
Chunheng Lu ◽  
Yinghao Zhang ◽  
Zhen Sun ◽  
Yu Shen

Abstract This paper presents a numerically stable algorithm for analytic inverse kinematics of 7-DoF S-R-S manipulators with joint limit avoidance. The arm angle is used to represent the self-motion manifold within a global arm configuration. The joint limits are analytically mapped to the arm angle space for joint limit avoidance. To profile the relation between each joint angle and the arm angle, it is critical to characterize the singular arm angle for each joint. In state-of-the-art methods, the existence of the singular arm angle is detected by comparing a discriminant with zero under a given threshold. We show that this leads to numerical issues, since the threshold is inconsistent among different target poses, producing an incorrect range of the arm angle. These issues are overcome by associating the indeterminate joint angles of tangent-type joints with angles of 0 or π of the cosine-type joints, rather than using an independent threshold for each joint. A closed-form algorithm in C++ that performs numerically stable inverse kinematics of 7-DoF S-R-S manipulators with global arm configuration control and joint limit avoidance is also given.
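The numerical issue and its remedy can be sketched as follows. This is an illustrative reading, not the paper's code: it shows the standard tangent- and cosine-type joint profiles of the arm-angle parameterization and a singularity test tied to the cosine joint reaching 0 or π, so that no pose-dependent discriminant threshold is needed. Function names, coefficients, and tolerances are assumptions.

```python
# Illustrative joint profiles of the arm-angle parameterization and a
# singularity test that avoids a pose-dependent discriminant threshold.
import numpy as np

def tangent_joint(psi, a, b, c, d, e, f):
    """Tangent-type joint: theta(psi) = atan2(a*sin+b*cos+c, d*sin+e*cos+f)."""
    return np.arctan2(a * np.sin(psi) + b * np.cos(psi) + c,
                      d * np.sin(psi) + e * np.cos(psi) + f)

def cosine_joint(psi, a, b, c):
    """Cosine-type joint: theta(psi) = arccos(a*sin(psi) + b*cos(psi) + c)."""
    return np.arccos(np.clip(a * np.sin(psi) + b * np.cos(psi) + c, -1.0, 1.0))

def tangent_joints_indeterminate(cosine_joint_angle, tol=1e-9):
    """The neighbouring tangent-type joints become indeterminate exactly when the
    cosine-type joint sits at 0 or pi; testing sin(theta) against a small absolute
    tolerance does not depend on the scale of the target pose."""
    return np.isclose(np.sin(cosine_joint_angle), 0.0, atol=tol)
```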


Robotics ◽  
2021 ◽  
Vol 11 (1) ◽  
pp. 1
Author(s):  
Omar W. Maaroof ◽  
Mehmet İsmet Can Dede ◽  
Levent Aydin

Redundancy resolution techniques have been widely used for the control of kinematically redundant robots. In this work, one such technique is employed in the mechanical design optimization of a robot arm. Although the robot arm is non-redundant, the proposed method modifies its kinematics by adding virtual joints to make the arm kinematically redundant. A suitable objective function is selected to optimize the robot arm’s kinematic parameters by enhancing one or more performance indices. The robot arm’s end-effector is then fixed at critical positions while the redundancy resolution algorithm moves its joints, including the virtual joints, through the self-motion of the redundant arm. Hence, the optimum values of the virtual joints are determined, and the design of the robot arm is modified accordingly. An advantage of this method is the visualization of the changes in the manipulator’s structure during the optimization process. As a case study, a passive robotic arm used in a surgical robot system is considered, and the task is defined as determining the optimum base location and the length of the first link. The results indicate the effectiveness of the proposed method.
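The self-motion mechanism the method exploits can be illustrated with classic gradient-projection redundancy resolution: with the task velocity held at zero, joint motion restricted to the Jacobian's null space leaves the end-effector fixed while a performance index improves. The sketch below is a generic illustration of that idea, not the authors' implementation; `jacobian_fn`, the manipulability index, and the step sizes are assumptions.

```python
# Generic gradient-projection self-motion step (illustrative; jacobian_fn assumed).
import numpy as np

def manipulability(J):
    """Yoshikawa's manipulability index w = sqrt(det(J J^T))."""
    return np.sqrt(max(np.linalg.det(J @ J.T), 0.0))

def self_motion_step(q, jacobian_fn, step=1e-2, eps=1e-6):
    """One null-space step: the end-effector stays put (task velocity is zero)
    while the joints, virtual ones included, climb the manipulability gradient."""
    n = len(q)
    J = jacobian_fn(q)
    w0 = manipulability(J)
    grad = np.array([(manipulability(jacobian_fn(q + eps * np.eye(n)[i])) - w0) / eps
                     for i in range(n)])             # finite-difference gradient
    null_proj = np.eye(n) - np.linalg.pinv(J) @ J    # projector onto the null space
    return q + step * (null_proj @ grad)             # pure self-motion update
```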


PLoS ONE ◽  
2021 ◽  
Vol 16 (12) ◽  
pp. e0261266
Author(s):  
Maëlle Tixier ◽  
Stéphane Rousset ◽  
Pierre-Alain Barraud ◽  
Corinne Cian

A large body of research has shown that visually induced self-motion (vection) and cognitive processing may interfere with each other. The aim of this study was to assess the interactive effects of a visual motion inducing vection (uniform motion in roll) versus a visual motion without vection (non-uniform motion) and long-term memory processing, using the characteristics of standing posture (quiet stance). As the level of interference may be related to the nature of the cognitive tasks used, we examined the effect of visual motion on a memory task that requires a spatial process (episodic recollection) versus one that does not (semantic comparison). Results confirm previous findings of a compensatory postural response in the same direction as the background motion. Repeatedly watching the uniform visual motion or increasing the cognitive load with a memory task did not decrease postural deviations. Finally, participants controlled their balance differently according to the memory task, but this difference was significant only in the vection condition and in the plane of the background motion. Increased sway regularity (decreased entropy) combined with decreased postural stability (increased variance) during vection for the episodic task would indicate ineffective postural control. The different interference of episodic and semantic memory with posture during visual motion is consistent with the involvement of spatial processes during episodic recollection. It can be suggested that spatial disorientation due to visual roll motion preferentially interferes with spatial cognitive tasks, as these tasks draw on the same resources used to control posture.
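The sway measures mentioned (variance for stability, entropy for regularity) can be computed as in the following sketch, which is illustrative rather than the authors' pipeline; the sample-entropy parameters and the synthetic centre-of-pressure trace are assumptions.

```python
# Sway variance and sample entropy of a centre-of-pressure trace (illustrative).
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy: -ln(A/B), where B and A count template pairs of length m
    and m+1 that match within tolerance r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    n = len(x) - m                                   # same template count for both lengths

    def matching_pairs(length):
        templates = np.array([x[i:i + length] for i in range(n)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return (np.sum(d <= r) - n) / 2              # exclude self-matches

    b, a = matching_pairs(m), matching_pairs(m + 1)
    return np.inf if a == 0 or b == 0 else -np.log(a / b)

cop = np.cumsum(np.random.randn(600)) * 0.1          # hypothetical COP trace (cm)
print(f"sway variance = {np.var(cop):.3f} cm^2, sample entropy = {sample_entropy(cop):.3f}")
```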


2021 ◽  
pp. 1-17
Author(s):  
Iqra Arshad ◽  
Paulo De Mello ◽  
Martin Ender ◽  
Jason D. McEwen ◽  
Elisa R. Ferré

Abstract Despite the technological advancements in Virtual Reality (VR), users constantly combat feelings of nausea and disorientation, the so-called cybersickness. Cybersickness symptoms cause severe discomfort and hinder the immersive VR experience. Here we investigated cybersickness in 360-degree head-mounted display VR. In traditional 360-degree VR experiences, translational movement in the real world is not reflected in the virtual world, so self-motion information is not corroborated by matching visual and vestibular cues, which may trigger symptoms of cybersickness. We evaluated whether new Artificial Intelligence (AI) software designed to supplement the 360-degree VR experience with artificial six-degrees-of-freedom motion may reduce cybersickness. Explicit (Simulator Sickness Questionnaire and Fast Motion Sickness, FMS, ratings) and implicit (heart rate) measurements were used to evaluate cybersickness symptoms during and after 360-degree VR exposure. Simulator sickness scores showed a significant reduction in feelings of nausea during the AI-supplemented six-degrees-of-freedom motion VR compared to traditional 360-degree VR. However, six-degrees-of-freedom motion VR did not reduce the oculomotor or disorientation measures of sickness, and no changes were observed in the FMS and heart rate measures. Improving the congruency between visual and vestibular cues in 360-degree VR, as provided by the AI-supplemented six-degrees-of-freedom motion system considered here, is essential for a more engaging, immersive, and safe VR experience, which is critical for educational, cultural, and entertainment applications.
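For illustration only, a paired comparison of sickness sub-scores between the two VR conditions might look like the sketch below; the scores are invented, and the choice of a Wilcoxon signed-rank test is an assumption, not the study's reported analysis.

```python
# Paired comparison of nausea sub-scores between VR conditions (invented data).
import numpy as np
from scipy.stats import wilcoxon

nausea_360 = np.array([28.6, 47.7, 19.1, 38.2, 57.2, 28.6, 15.3, 38.2])   # traditional 360-degree VR
nausea_6dof = np.array([19.1, 28.6, 9.5, 28.6, 38.2, 19.1, 9.5, 28.6])    # AI-supplemented 6-DoF VR

stat, p = wilcoxon(nausea_360, nausea_6dof)
print(f"Wilcoxon signed-rank: W = {stat:.1f}, p = {p:.3f}")
```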

