Decisions in motion: passive body acceleration modulates hand choice

2017 ◽  
Vol 117 (6) ◽  
pp. 2250-2261 ◽  
Author(s):  
Romy S. Bakker ◽  
Roel H. A. Weijer ◽  
Robert J. van Beers ◽  
Luc P. J. Selen ◽  
W. Pieter Medendorp

In everyday life, we frequently have to decide which hand to use for a certain action. It has been suggested that for this decision the brain calculates expected costs based on action values, such as expected biomechanical costs, expected success rate, handedness, and skillfulness. Although these conclusions were based on experiments in stationary subjects, we often act while the body is in motion. We investigated how hand choice is affected by passive body motion, which directly affects the biomechanical costs of the arm movement due to its inertia. With the use of a linear motion platform, 12 right-handed subjects were sinusoidally translated (0.625 and 0.5 Hz). At 8 possible motion phases, they had to reach, using either their left or right hand, to a target presented at 1 of 11 possible locations. We predicted hand choice by calculating the expected biomechanical costs under different assumptions about the future acceleration involved in these computations: the forthcoming acceleration during the reach, the instantaneous acceleration at target onset, or zero acceleration as if the body were stationary. Although hand choice was generally biased toward the dominant hand, it also modulated sinusoidally with the motion, with the amplitude of the bias depending on the motion's peak acceleration. The phase of hand choice modulation was consistent with the cost model that took the instantaneous acceleration signal at target onset. This suggests that the brain relies on bottom-up acceleration signals, and not on predictions about future accelerations, when deciding on hand choice during passive whole-body motion.

NEW & NOTEWORTHY Decisions of hand choice are a fundamental aspect of human behavior. Whereas these decisions are typically studied in stationary subjects, this study examines hand choice while subjects are in motion. We show that accelerations of the body, which differentially modulate the biomechanical costs of left and right hand movements, are also taken into account when deciding which hand to use for a reach, possibly based on bottom-up processing of the otolith signal.
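The model comparison above lends itself to a compact illustration. Below is a minimal sketch in Python of how an expected-cost rule could pick a hand under the three acceleration assumptions; the arm mass, hand rest positions, reach duration, and peak acceleration are toy values, not the authors' fitted parameters.

```python
import numpy as np

# Toy sketch of expected-cost hand selection during sinusoidal body motion.
# All numerical values are assumptions, not the study's fitted parameters.

ARM_MASS = 2.0     # kg, effective arm mass (assumed)
PEAK_ACC = 1.0     # m/s^2, platform peak acceleration (assumed)
FREQ = 0.5         # Hz, one of the motion frequencies used in the study
T_REACH = 0.4      # s, assumed reach duration

def body_acc(t):
    """Lateral platform acceleration at time t (sinusoidal translation)."""
    return PEAK_ACC * np.sin(2 * np.pi * FREQ * t)

def reach_cost(target_x, hand_x, acc):
    """Toy biomechanical cost: muscle-driven acceleration needed to reach
    the target, given that the body already accelerates laterally by acc."""
    a_reach = 2 * (target_x - hand_x) / T_REACH**2   # constant-accel. toy
    return abs(ARM_MASS * (a_reach - acc))

def choose_hand(target_x, t_onset, assumption="instantaneous"):
    """Pick the cheaper hand under one of the three acceleration assumptions."""
    if assumption == "stationary":       # as if the body were not moving
        acc = 0.0
    elif assumption == "instantaneous":  # acceleration at target onset
        acc = body_acc(t_onset)
    else:                                # "forthcoming": mean during the reach
        acc = body_acc(np.linspace(t_onset, t_onset + T_REACH, 20)).mean()
    costs = {"left": reach_cost(target_x, -0.2, acc),
             "right": reach_cost(target_x, +0.2, acc)}
    return min(costs, key=costs.get)

# probe the 8 motion phases for a straight-ahead target
for t in np.linspace(0, 1 / FREQ, 8, endpoint=False):
    print(f"phase t={t:.2f} s -> {choose_hand(0.0, t)}")
```

Under the "instantaneous" assumption the predicted choice flips with the phase of the platform motion, which is the phase signature the study compares against behavior.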

2017 ◽  
Vol 118 (4) ◽  
pp. 2499-2506 ◽  
Author(s):  
A. Pomante ◽  
L. P. J. Selen ◽  
W. P. Medendorp

The vestibular system provides information for spatial orientation. However, this information is ambiguous: because the otoliths sense the gravitoinertial force, they cannot distinguish gravitational and inertial components. As a consequence, prolonged linear acceleration of the head can be interpreted as tilt, referred to as the somatogravic effect. Previous modeling work suggests that the brain disambiguates the otolith signal according to the rules of Bayesian inference, combining noisy canal cues with the a priori assumption that prolonged linear accelerations are unlikely. Within this modeling framework, the noise of the vestibular signals affects the dynamic characteristics of the tilt percept during linear whole-body motion. To test this prediction, we devised a novel paradigm to psychometrically characterize the dynamic visual vertical, as a proxy for the tilt percept, during passive sinusoidal linear motion along the interaural axis (0.33 Hz motion frequency, 1.75 m/s² peak acceleration, 80 cm displacement). While subjects (n = 10) kept fixation on a central body-fixed light, a line was briefly flashed (5 ms) at different phases of the motion, the orientation of which had to be judged relative to gravity. Consistent with the model's prediction, subjects showed a phase-dependent modulation of the dynamic visual vertical, with a subject-specific phase shift with respect to the imposed acceleration signal. The magnitude of this modulation was smaller than predicted, suggesting a contribution of nonvestibular signals to the dynamic visual vertical. Despite their dampening effect, our findings may point to a link between the noise components in the vestibular system and the characteristics of the dynamic visual vertical.

NEW & NOTEWORTHY A fundamental question in neuroscience is how the brain processes vestibular signals to infer the orientation of the body and objects in space. We show that, under sinusoidal linear motion, systematic error patterns appear in the disambiguation of linear acceleration and spatial orientation. We discuss the dynamics of these illusory percepts in terms of a dynamic Bayesian model that combines uncertainty in the vestibular signals with priors based on the natural statistics of head motion.
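The core of the Bayesian account can be reduced to a static, small-angle sketch. All noise and prior values below are assumptions, and the published model is dynamic and also uses canal cues; still, a zero-mean prior on acceleration makes the MAP estimate misattribute part of a sustained lateral acceleration to tilt, which is the somatogravic effect:

```python
import numpy as np

# Static small-angle sketch of Bayesian otolith disambiguation.
# Along the interaural axis the otoliths sense f = g*sin(theta) - a, so a
# lateral acceleration a mimics a small tilt theta. With Gaussian priors
# theta ~ N(0, s_t^2), a ~ N(0, s_a^2) and otolith noise s_o, the MAP tilt
# for a measurement f is (small-angle approximation):
#   theta_hat = g * s_t^2 * f / (g^2 * s_t^2 + s_a^2 + s_o^2)

G = 9.81
S_TILT = np.deg2rad(5.0)  # prior spread on tilt (assumed)
S_ACC = 1.0               # prior spread on acceleration, m/s^2 (assumed)
S_OTO = 0.5               # otolith noise, m/s^2 (assumed)

def perceived_tilt(f):
    gain = G * S_TILT**2 / (G**2 * S_TILT**2 + S_ACC**2 + S_OTO**2)
    return gain * f       # radians of illusory tilt

# sinusoidal translation as in the experiment: 0.33 Hz, 1.75 m/s^2 peak
t = np.linspace(0.0, 3.0, 7)
f = -1.75 * np.sin(2 * np.pi * 0.33 * t)   # upright head: shear = -a
for ti, tilt in zip(t, np.rad2deg(perceived_tilt(f))):
    print(f"t = {ti:.2f} s  perceived tilt {tilt:+.2f} deg")
```

In this static version the illusory tilt tracks the acceleration with no phase shift; the subject-specific phase shifts reported above are what the dynamic model attributes to the interplay of vestibular noise and priors.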


2019 ◽  
Vol 121 (6) ◽  
pp. 2392-2400 ◽  
Author(s):  
Romy S. Bakker ◽  
Luc P. J. Selen ◽  
W. Pieter Medendorp

In daily life, we frequently reach toward objects while our body is in motion. We have recently shown that body accelerations influence the decision of which hand to use for the reach, possibly by modulating the body-centered computations of the expected reach costs. However, head orientation relative to the body was not manipulated, and hence it remains unclear whether vestibular signals contribute to these cost calculations in their head-based sensory frame or in a transformed, body-centered reference frame. To test this, subjects performed a preferential reaching task to targets in various directions while they were sinusoidally translated along the lateral body axis, with their head either aligned with the body (straight ahead) or rotated 18° to the left. As a measure of hand preference, we determined the target direction that resulted in equiprobable right/left-hand choices. Results show that head orientation affects this balanced target angle when the body is stationary but does not further modulate hand preference when the body is in motion. Furthermore, reaction and movement times were longer for reaches to the balanced target angle, resembling a competitive selection process, and were modulated by head orientation when the body was stationary. During body translation, reaction and movement times depended on the phase of the motion, but this phase-dependent modulation did not interact with head orientation. We conclude that the brain transforms vestibular signals to body-centered coordinates at the early stage of reach planning, when the decision of hand choice is computed.

NEW & NOTEWORTHY The brain takes inertial acceleration into account in computing the anticipated biomechanical costs that guide hand selection during whole-body motion. Whereas these costs are defined in a body-centered, muscle-based reference frame, the otoliths detect the inertial acceleration in head-centered coordinates. By systematically manipulating head position relative to the body, we show that the brain transforms otolith signals into body-centered coordinates at an early stage of reach planning, i.e., before the decision of hand choice is computed.
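On the measure itself: the balanced target angle is the 50% point of a psychometric curve relating target direction to the probability of a right-hand choice. A minimal sketch with invented choice proportions follows; the logistic form and all numbers are assumptions, not the study's exact fitting procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a logistic psychometric curve to hand-choice proportions and read off
# the balanced target angle (equiprobable left/right-hand choices).

def logistic(x, mu, sigma):
    return 1.0 / (1.0 + np.exp(-(x - mu) / sigma))

angles = np.linspace(-50, 50, 11)            # target directions, deg (toy)
p_right = np.array([0.02, 0.05, 0.08, 0.20, 0.35,
                    0.55, 0.75, 0.90, 0.95, 0.98, 0.99])  # toy proportions

(mu, sigma), _ = curve_fit(logistic, angles, p_right, p0=(0.0, 10.0))
print(f"balanced target angle: {mu:.1f} deg")  # shifts with head/body state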


2021 ◽  
Author(s):  
Leonie Oostwoud Wijdenes ◽  
Syanah C. Wynn ◽  
Béla S. Roesink ◽  
Dennis J.L.G. Schutter ◽  
Luc P.J. Selen ◽  
...  

Abstract Behavioral studies have shown that humans account for inertial acceleration in their decisions of hand choice when reaching during body motion. Physiologically, it is unclear at what stage of movement preparation information about body motion is integrated in the process of hand selection. Here, we addressed this question by applying transcranial magnetic stimulation over motor cortex (M1) of human participants who performed a preferential reach task while they were sinusoidally translated on a linear motion platform. If M1 only represents a read-out of the final hand choice, we expect the body motion not to affect the MEP amplitude. If body motion biases the hand selection process prior to target onset, we expect corticospinal excitability to modulate with the phase of the motion, with larger MEP amplitudes for phases that show a bias to using the right hand. Behavioral results replicate our earlier findings of a sinusoidal modulation of hand choice bias with motion phase. MEP amplitudes also show a sinusoidal modulation with motion phase, suggesting that body motion influences corticospinal excitability, which may ultimately reflect changes of hand preference. The presence of this modulation prior to target onset suggests that the competition between hands is represented throughout the corticospinal tract. Its phase relationship with the motion profile suggests that other processes after target onset take additional time before the hand selection process is completely resolved and the reach is initiated. We conclude that the corticospinal correlates of hand preference are modulated by body motion.
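One way to quantify such a phase-dependent modulation is to fit a one-cycle sinusoid to the mean MEP amplitude at each probed motion phase; the modulation depth and phase shift then summarize the effect. A minimal sketch with invented amplitudes (data values, units, and starting guesses are assumptions, not the study's analysis pipeline):

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a one-cycle sinusoid to MEP amplitudes binned by motion phase.

def sinusoid(phase, amp, phi, offset):
    return amp * np.sin(phase + phi) + offset

phases = np.linspace(0, 2 * np.pi, 8, endpoint=False)            # probed phases
mep = np.array([1.10, 1.25, 1.32, 1.18, 0.95, 0.82, 0.78, 0.92])  # mV, toy

(amp, phi, offset), _ = curve_fit(sinusoid, phases, mep, p0=(0.2, 0.0, 1.0))
print(f"modulation depth {abs(amp):.2f} mV, phase shift {np.rad2deg(phi):.0f} deg")
```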


1994 ◽  
Vol 6 (2) ◽  
pp. 99-116 ◽  
Author(s):  
M. W. Oram ◽  
D. I. Perrett

Cells have been found in the superior temporal polysensory area (STPa) of the macaque temporal cortex that are selectively responsive to the sight of particular whole body movements (e.g., walking) under normal lighting. These cells typically discriminate the direction of walking and the view of the body (e.g., left profile walking left). We investigated the extent to which these cells are responsive under “biological motion” conditions where the form of the body is defined only by the movement of light patches attached to the points of limb articulation. One-third of the cells (25/72) selective for the form and motion of walking bodies showed sensitivity to the moving light displays. Seven of these cells showed only partial sensitivity to form from motion, in so far as the cells responded more to moving light displays than to moving controls but failed to discriminate body view. These seven cells exhibited directional selectivity. Eighteen cells showed statistical discrimination for both direction of movement and body view under biological motion conditions. Most of these cells showed reduced responses to the impoverished moving light stimuli compared to full light conditions. The 18 cells were thus sensitive to detailed form information (body view) from the pattern of articulating motion. Cellular processing of the global pattern of articulation was indicated by the observations that none of these cells were found sensitive to movement of individual limbs and that jumbling the pattern of moving limbs reduced response magnitude. A further 10 cells were tested for sensitivity to moving light displays of whole body actions other than walking. Of these cells 5/10 showed selectivity for form displayed by biological motion stimuli that paralleled the selectivity under normal lighting conditions. The cell responses thus provide direct evidence for neural mechanisms computing form from nonrigid motion. The selectivity of the cells was for body view, specific direction, and specific type of body motion presented by moving light displays and is not predicted by many current computational approaches to the extraction of form from motion.


2019 ◽  
Vol 21 (Supplement_3) ◽  
pp. iii58-iii58
Author(s):  
J Rowlinson ◽  
P McCrorie ◽  
S Smith ◽  
D Barrett ◽  
D Kim ◽  
...  

Abstract BACKGROUND Conventional oral or intravenous chemotherapy distributes drugs to the whole body, whereby systemic toxicity to healthy parts of the body (e.g., bone marrow failure) limits the maximum dose that can be achieved in the brain. This presents a particular concern for CNS tumours, where the blood-brain barrier (BBB) restricts drug influx from the circulation. The ability to deliver chemotherapy locally at the tumour site offers the opportunity to target residual cancer cells post-surgery whilst minimising systemic toxicity. We have developed a poly(lactic-co-glycolic acid)/poly(ethylene glycol) (PLGA/PEG) polymer matrix that forms a porous paste at room temperature when mixed with chemotherapy-containing saline, solidifying only at body temperature, with close apposition to the irregular surgical cavity. It is important that we can observe whether the drugs released from PLGA/PEG can penetrate brain parenchyma beyond the surgical resection margin at therapeutic doses. Currently the only way to measure the distribution of drugs in the body is to inject radiolabelled drugs into an animal. We aim to establish drug distribution parameters using label-free mass spectrometry imaging methods, prior to selection of drug formulations for clinically relevant in vivo models. Drugs that penetrate the brain the furthest will be identified as good candidates for localised brain cancer drug delivery using PLGA/PEG paste. MATERIAL AND METHODS Diffusion rates were measured by examining the proportion of olaparib, dasatinib, carboplatin, etoposide, paclitaxel and gemcitabine, at 2 mg/ml concentration, that passed through 1 mm slices of rat brain tissue within Franz cell chambers over a 6-hour period. The spatio-temporal distribution of label-free olaparib and dasatinib within mouse brain homogenate was quantitatively measured using innovative 3D OrbiSIMS, a hybrid time-of-flight/Orbitrap™ secondary ion mass spectrometer. RESULTS Within the Franz cell model, carboplatin and gemcitabine showed the highest diffusion rates at 16.4 and 6.53 µg/cm²/h respectively, whereas olaparib, etoposide and paclitaxel diffused relatively poorly at 1.87, 3.82 and 2.27 µg/cm²/h respectively. The minimum threshold of OrbiSIMS detection for label-free olaparib and dasatinib ions was 0.025 mg/ml and 0.2 mg/ml respectively throughout brain homogenate. CONCLUSION This study demonstrates different diffusion rates through brain tissue between label-free chemotherapy drugs of distinct chemistries, with the highest diffusion rates observed for carboplatin and gemcitabine. We also demonstrate label-free detection of olaparib and dasatinib using the innovative 3D OrbiSIMS method. These models will facilitate the rapid identification of agents most amenable for localised biomaterial-based chemotherapy delivery with high brain penetrance.
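For orientation, the reported rates are fluxes: cumulative drug permeated per unit tissue area per unit time. A minimal sketch of the arithmetic, where the exposed area and cumulative amount are assumed values chosen to reproduce the reported carboplatin figure rather than measurements from the study:

```python
# Franz-cell flux: J = Q / (A * t), cumulative amount Q permeated through
# exposed tissue area A over time t.

area_cm2 = 0.64       # exposed tissue area (assumed)
duration_h = 6.0      # sampling window used in the study
q_total_ug = 63.0     # cumulative carboplatin permeated (assumed)

flux = q_total_ug / (area_cm2 * duration_h)
print(f"flux = {flux:.1f} ug/cm2/h")   # ~16.4 ug/cm2/h, as reported
```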


2019 ◽  
Vol 20 (11) ◽  
pp. 2765 ◽  
Author(s):  
Jihwan Myung ◽  
Mei-Yi Wu ◽  
Chun-Ya Lee ◽  
Amalia Ridla Rahim ◽  
Vuong Hung Truong ◽  
...  

The kidney harbors one of the strongest circadian clocks in the body. Kidney failure has long been known to cause circadian sleep disturbances. Using an adenine-induced model of chronic kidney disease (CKD) in mice, we probe the possibility that such sleep disturbances originate from aberrant circadian rhythms in the kidney. Under the CKD condition, mice developed unstable behavioral circadian rhythms. When observed in isolation in vitro, the pacing of the master clock, the suprachiasmatic nucleus (SCN), remained uncompromised, while the kidney clock became a less robust circadian oscillator with a longer period. We find this analogous to the silencing of a strong slave clock in the brain, the choroid plexus, which alters the pacing of the SCN. We propose that the kidney also contributes to overall circadian timekeeping at the whole-body level, through bottom-up feedback in the hierarchical structure of the mammalian circadian clocks.


2015 ◽  
Vol 114 (6) ◽  
pp. 3211-3219 ◽  
Author(s):  
J. J. Tramper ◽  
W. P. Medendorp

It is known that the brain uses multiple reference frames to code spatial information, including eye-centered and body-centered frames. When we move our body in space, these internal representations are no longer in register with external space, unless they are actively updated. Whether the brain updates multiple spatial representations in parallel, or whether it restricts its updating mechanisms to a single reference frame from which other representations are constructed, remains an open question. We developed an optimal integration model to simulate the updating of visual space across body motion in multiple or single reference frames. To test this model, we designed an experiment in which participants had to remember the location of a briefly presented target while being translated sideways. The behavioral responses were in agreement with a model that uses a combination of eye- and body-centered representations, weighted according to the reliability with which the target location is stored and updated in each reference frame. Our findings suggest that the brain simultaneously updates multiple spatial representations across body motion. Because both representations are kept in sync, they can be optimally combined to provide a more precise estimate of visual locations in space than single-frame updating mechanisms would allow.
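The combination step at the end has a standard closed form: weight each representation by its inverse variance, which always yields a lower-variance estimate than either frame alone. A minimal sketch with assumed numbers (not the model's fitted reliabilities):

```python
# Reliability-weighted (inverse-variance) combination of eye- and
# body-centered estimates of a remembered target location.

def combine(x_eye, v_eye, x_body, v_body):
    w_eye = (1 / v_eye) / (1 / v_eye + 1 / v_body)
    x_hat = w_eye * x_eye + (1 - w_eye) * x_body
    v_hat = 1 / (1 / v_eye + 1 / v_body)   # smaller than v_eye and v_body
    return x_hat, v_hat

# e.g. the eye-centered estimate degrades more during a sideways translation
x_hat, v_hat = combine(x_eye=12.0, v_eye=4.0, x_body=10.0, v_body=1.0)
print(f"combined estimate {x_hat:.1f} deg, variance {v_hat:.2f}")
```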


2011 ◽  
Vol 26 (S2) ◽  
pp. 935-935
Author(s):  
R. Krishnadas ◽  
A. Nicol ◽  
S. Champion ◽  
S. Pimlott ◽  
J. Stehouwer ◽  
...  

Levels of serotonin in the body are regulated by the serotonin transporters (SERT), which are predominantly located on the presynaptic terminals of serotonin-containing neurons. Alterations in the density of SERT have been implicated in the pathophysiology of many neuropsychiatric disorders. AIM To evaluate 123-I-mZIENT (2(S)-[(S)-2b-carbomethoxy-3b-[3′-((Z)-2-iodoethenyl)phenyl]nortropane), a novel radiopharmaceutical for imaging SERT. The biodistribution of the radiopharmaceutical in humans was investigated and dosimetry performed. METHODS The study included three healthy volunteers and three patients receiving SSRIs. Whole-body images were obtained on a gamma camera at 10 minutes and at 1, 2, 3, 6, 24 and 48 hours post administration, and dosimetry was performed. ROIs were drawn over the brain, heart, kidneys, liver, lungs, salivary glands, spleen, thyroid and intestines. Blood was sampled at 5, 15 and 30 minutes and at 1, 2, 3, 6, 24 and 48 hours post administration. Urine was collected at 1, 2, 3, 4, 6, 24 and 48 hours. Brain SPECT images were obtained using a neuroSPECT scanner at 4 hours, evaluated visually and analysed using ROI analysis. RESULTS High-quality SPECT images can be obtained after 100–150 MBq of 123-I-mZIENT. Regional brain uptake was observed in midbrain and basal ganglia in healthy volunteers, consistent with the known distribution of SERT. Biodistribution images demonstrated highest uptake in the lungs, brain, liver and intestines. The effective dose was within the range of other commonly used ligands and is acceptable for clinical imaging. CONCLUSION 123-I-mZIENT is a promising agent for imaging SERT in humans with acceptable dosimetry.


2019 ◽  
Author(s):  
Leyla Tarhan ◽  
Talia Konkle

Humans observe a wide range of actions in their surroundings. How is the visual cortex organized to process this diverse input? Using functional neuroimaging, we measured brain responses while participants viewed short videos of everyday actions, then probed the structure in these responses using voxel-wise encoding modeling. Responses were well fit by feature spaces that capture the body parts involved in an action and the action's targets (i.e., whether the action was directed at an object, another person, the actor, or space). Clustering analyses revealed five large-scale networks that summarized the voxel tuning: one related to social aspects of an action, and four related to the scale of the interaction envelope, ranging from fine-scale manipulations directed at objects to large-scale whole-body movements directed at distant locations. We propose that these networks reveal the major representational joints in how actions are processed by visual regions of the brain.

Significance Statement How does the brain perceive other people's actions? Prior work has established that much of the visual cortex is active when observing others' actions. However, this activity reflects a wide range of processes, from identifying a movement's direction to recognizing its social content. We investigated how these diverse processes are organized within the visual cortex. We found that five networks respond during action observation: one that is involved in processing actions' social content, and four that are involved in processing agent-object interactions and the scale of the effect that these actions have on the world (their "interaction envelope"). Based on these findings, we propose that sociality and interaction envelope size are two of the major features that organize action perception in the visual cortex.
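Voxel-wise encoding modeling of this kind amounts to regressing each voxel's responses onto a stimulus feature space and scoring predictions on held-out clips. A minimal sketch with simulated data follows; the shapes, the ridge penalty, and the correlation score are assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Voxel-wise encoding: features (body parts, action targets) -> responses.

rng = np.random.default_rng(0)
n_train, n_test, n_feat, n_vox = 120, 40, 20, 500

X_train = rng.normal(size=(n_train, n_feat))   # clip-by-feature design
X_test = rng.normal(size=(n_test, n_feat))
W_true = rng.normal(size=(n_feat, n_vox))      # toy ground-truth tuning
y_train = X_train @ W_true + rng.normal(size=(n_train, n_vox))
y_test = X_test @ W_true + rng.normal(size=(n_test, n_vox))

model = Ridge(alpha=1.0).fit(X_train, y_train)  # one fit covers all voxels
pred = model.predict(X_test)

# per-voxel accuracy: correlation between predicted and observed responses
r = [np.corrcoef(pred[:, v], y_test[:, v])[0, 1] for v in range(n_vox)]
print(f"median voxel prediction r = {np.median(r):.2f}")
```

The per-voxel weight vectors from such a fit are what the clustering analyses would then summarize into large-scale networks.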


2021 ◽  
Vol 12 ◽  
Author(s):  
Yvan Pratviel ◽  
Veronique Deschodt-Arsac ◽  
Florian Larrue ◽  
Laurent M. Arsac

Beyond apparent simplicity, visuomotor dexterity actually requires the coordination of multiple interactions across a complex system that links the brain, the body and the environment. Recent research suggests that a better understanding of how perceptive, cognitive and motor activities cohere to form executive control could be gained from multifractal formalisms applied to movement behavior. Rather than a central executive "talking" to encapsulated components, the multifractal intuition suggests that eye-hand coordination arises from multiplicative cascade dynamics across temporal scales of activity within the whole system, which is reflected in movement time series. Here we examined hand movements of sport students performing a visuomotor task in virtual reality (VR). The task involved hitting spatially arranged targets that lit up on a virtual board under critical time pressure. Three conditions were compared in which the visual search field changed: whole board (Standard), half-board lower view field (LVF) and upper view field (UVF). Densely sampled (90 Hz) time series of hand motions captured by VR controllers were analyzed by a focus-based multifractal detrended fluctuation analysis (DFA). Multiplicative rather than additive interactions across temporal scales were evidenced by comparing the experimental series with their phase-randomized surrogates, which confirmed nonlinear processes. As main results, it was demonstrated that: (i) the degree of multifractality in hand motion behavior was minimal in LVF, a familiar visual search field where subjects correlatively reached their best visuomotor response times (RTs); (ii) multifractality increased in the less familiar UVF, but interestingly only for the non-dominant hand; and (iii) multifractality increased further in Standard, for both hands indifferently; in Standard, the maximal expansion of the visual search field imposed the highest demand, as evidenced by the worst visuomotor RTs. Our observations suggest that visuomotor dexterity is best described by multiplicative cascade dynamics and system-wide distributed control rather than a central executive. More importantly, multifractal metrics obtained from hand movement behavior, beyond the confines of the brain, offer a window on the fine organization of the control architecture, with high sensitivity to hand-related control behavior under specific constraints. Appealing applications may be found in movement learning and rehabilitation, e.g., in people with hemineglect, stroke patients, maturing children or athletes.
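The surrogate test mentioned above works by generating series that keep the original power spectrum (hence all linear correlations) but scramble the Fourier phases; if a nonlinear statistic computed on the original falls outside the surrogate distribution, additive linear dynamics cannot explain it. A minimal sketch, in which the input series and the test statistic are placeholders rather than the paper's focus-based multifractal DFA:

```python
import numpy as np

# Phase-randomization surrogate test for nonlinear structure.

def phase_randomized(x, rng):
    """Surrogate with the same amplitude spectrum but random phases."""
    spec = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, size=spec.size)
    phases[0] = 0.0    # keep the DC component real
    phases[-1] = 0.0   # keep the Nyquist component real (even length)
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size)

rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(size=4096))    # stand-in for a hand-motion series

def stat(s):
    return np.mean(np.abs(np.diff(s)) ** 3)   # placeholder nonlinear statistic

surrogate_stats = [stat(phase_randomized(x, rng)) for _ in range(200)]
lo, hi = np.quantile(surrogate_stats, [0.025, 0.975])
print(f"original {stat(x):.3f}, surrogate 95% interval [{lo:.3f}, {hi:.3f}]")
```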

