The Influence of Induced Emotions on Distance and Size Perception and on the Grip Scaling During Grasping

2021 ◽  
Vol 12 ◽  
Author(s):  
Chuyang Sun ◽  
Juan Chen ◽  
Yuting Chen ◽  
Rixin Tang

Previous studies have shown that our perception of stimulus properties can be affected by the emotional nature of the stimulus. It is not clear, however, how emotions affect visually guided actions toward objects. To address this question, we used toy rats, toy squirrels, and wooden blocks to induce negative, positive, and neutral emotions, respectively. Participants were asked either to report the perceived distance and the perceived size of a target object resting on top of one of the three emotion-inducing objects, or to grasp the same target object without visual feedback (open-loop) or with visual feedback (closed-loop) of both the target object and their grasping hand during the execution of grasping. We found that the target object was perceived as closer and larger, but was grasped with a smaller grip aperture, in the rat condition than in the squirrel and wooden-block conditions when no visual feedback was available. With visual feedback present, this difference in grip aperture disappeared. These results show that negative emotion influences both perceived size and grip aperture, but in opposite directions (larger perceived size but smaller grip aperture), and that its influence on grip aperture can be corrected by visual feedback, revealing different effects of emotion on perception and on action. These findings bear on our understanding of the relationship between perception and action under emotional conditions and reveal a difference not anticipated by previous theories.
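A minimal analysis sketch of the kind of condition comparison described above, with simulated data; the sample size, aperture values, and the choice of paired t-tests are illustrative assumptions, not the authors' analysis.

```python
# Hedged sketch: comparing mean maximum grip aperture (MGA) across the three
# emotion conditions with paired t-tests; all numbers below are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_participants = 24                       # hypothetical sample size

# One mean MGA value (mm) per participant per condition; the rat (negative)
# condition is given a slightly smaller aperture, mimicking the open-loop effect.
mga_rat      = rng.normal(78.0, 4.0, n_participants)
mga_squirrel = rng.normal(80.0, 4.0, n_participants)
mga_block    = rng.normal(80.0, 4.0, n_participants)

for label, other in [("squirrel", mga_squirrel), ("wooden block", mga_block)]:
    t, p = stats.ttest_rel(mga_rat, other)
    print(f"rat vs {label}: t = {t:.2f}, p = {p:.3f}")
```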

2009 ◽  
Vol 102 (2) ◽  
pp. 875-885 ◽  
Author(s):  
Haleh Fotowat ◽  
Amir Fayyazuddin ◽  
Hugo J. Bellen ◽  
Fabrizio Gabbiani

Drosophila melanogaster exhibits a robust escape response to objects approaching on a collision course. Although a pair of large command interneurons called the giant fibers (GFs) have been postulated to trigger such behaviors, their role has not been directly demonstrated. Here, we show that escape from visual stimuli like those generated by approaching predators does not rely on the activation of the GFs and consists of a more complex and less stereotyped motor sequence than that evoked by the GFs. Instead, the timing of escape is tightly correlated with the activity of previously undescribed descending interneurons that signal a threshold angular size of the approaching object. The activity pattern of these interneurons shares features with those of visual escape circuits of several species, including pigeons, frogs, and locusts, and may therefore have evolved under similar constraints. These results show that visually evoked escapes in Drosophila can rely on at least two descending neuronal pathways: the GFs and the novel pathway we characterize electrophysiologically. These pathways exhibit very different patterns of sensory activity and are associated with two distinct motor programs.
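As a worked illustration of the threshold-angular-size idea (geometry assumed, not taken from the paper): for an object of size l approaching at speed v with time t remaining to collision, the angular size is θ(t) = 2·atan(l / (2·v·t)), so a fixed angular threshold is crossed at a time before collision proportional to l/|v|. A minimal sketch:

```python
# Hedged sketch of looming geometry; parameter values are hypothetical.
import math

def angular_size(l, v, t_to_collision):
    """Full angular size (radians) of an object of size l approaching at speed v."""
    return 2.0 * math.atan(l / (2.0 * v * t_to_collision))

def threshold_crossing_time(l, v, theta_thr):
    """Time before collision at which angular size reaches theta_thr (radians)."""
    return l / (2.0 * v * math.tan(theta_thr / 2.0))

# Example: a 1 cm object approaching at 0.5 m/s, with a 30-degree threshold.
l, v = 0.01, 0.5
theta_thr = math.radians(30.0)
t_cross = threshold_crossing_time(l, v, theta_thr)
print(f"threshold crossed {t_cross * 1000:.1f} ms before collision "
      f"({math.degrees(angular_size(l, v, t_cross)):.1f} deg)")
```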


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Thomas F. Mathejczyk ◽  
Mathias F. Wernet

Many navigating insects include the celestial polarization pattern as an additional visual cue to orient their travels. Spontaneous orientation responses of both walking and flying fruit flies (Drosophila melanogaster) to linearly polarized light have previously been demonstrated. Using newly designed modular flight arenas consisting entirely of off-the-shelf parts and 3D-printed components, we present individual flying flies with a slow and continuous rotational change in the incident angle of linear polarization. Under such open-loop conditions, single flies choose arbitrary headings with respect to the angle of polarized light and show a clear tendency to maintain those chosen headings for several minutes, thereby adjusting their course to the slow rotation of the incident stimulus. Importantly, flies show a tendency to maintain a chosen heading even when two individual test periods under a linearly polarized stimulus are interrupted by an epoch of unpolarized light lasting several minutes. Finally, we show that these behavioral responses are wavelength-specific, occurring under a polarized UV stimulus while being absent under polarized green light. Taken together, these findings provide further evidence supporting Drosophila’s ability to use celestial cues for visually guided navigation and course correction.
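One conventional way to quantify how well a chosen heading is maintained relative to a rotating e-vector is an axial circular statistic (angles doubled, because a linear polarization angle is 180°-periodic). The sketch below uses hypothetical headings, not the authors' data or pipeline.

```python
# Hedged sketch: mean resultant length of headings relative to the e-vector.
import numpy as np

def axial_resultant(headings_deg, evector_deg):
    """Mean resultant length of (heading - e-vector), treated as axial data."""
    rel = np.deg2rad(np.asarray(headings_deg) - np.asarray(evector_deg))
    z = np.exp(2j * rel)                   # double the angles (axial statistics)
    return np.abs(z.mean())                # 1 = heading maintained, 0 = random

# Example: a fly tracking a polarizer rotating at 1 deg/s with small noise.
t = np.arange(0, 300)                      # seconds
evector = t * 1.0                          # e-vector orientation over time (deg)
heading = evector + 40 + np.random.default_rng(1).normal(0, 8, t.size)
print(f"resultant length: {axial_resultant(heading, evector):.2f}")
```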


2000 ◽  
Vol 12 (6) ◽  
pp. 950-964 ◽  
Author(s):  
Angela M. Haffenden ◽  
Melvyn A. Goodale

The present set of experiments investigated the possibility that learned perceptual information can, under certain circumstances, be utilized by visuomotor programming. In Experiment 1 (N = 28), an association was established between the color and size of square wooden blocks (e.g., red = large; yellow = small, or vice versa). In Experiment 2 (N = 28), an association was established between the shape and size of plastic objects (e.g., hexagon = large; circle = small, or vice versa). It was expected that the learned associations would change the perceived size of two probe objects halfway in size between the large and small objects (the probe object matched by color or shape to the large group of objects would appear smaller than the probe object matched to the small group of objects as a result of within-group relative size comparisons). In both experiments, half of the participants grasped the target objects, and the other half estimated the size of the objects by opening their thumb and finger a matching amount. For Experiment 1, it was predicted that an influence of the learned association on the treatment of the probe objects would be seen in manual estimations and in grip scaling because the kinematics of the grasping movement were very similar across trials. As predicted, the learned association between size and color was as easily incorporated into visually guided grasping as it was into visual perception. In Experiment 2, it was predicted that an influence of the learned perceptual association would be seen only in manual estimations, and not in grip scaling, because the variability in target object shape from trial to trial would demand changes in precontact finger posture across trials. Despite the significant effect of the size-shape association on size estimations, no influence was seen in preparatory grip scaling, probably because varying shape increased the metrical demands on visuomotor programming relative to Experiment 1. Together, the results suggest that visuomotor programming can make use of learned size information under some, but not all, conditions.


2012 ◽  
Vol 108 (5) ◽  
pp. 1335-1348 ◽  
Author(s):  
Cynthia Poon ◽  
Lisa G. Chin-Cottongim ◽  
Stephen A. Coombes ◽  
Daniel M. Corcos ◽  
David E. Vaillancourt

It is well established that the prefrontal cortex is involved during memory-guided tasks whereas visually guided tasks are controlled in part by a frontal-parietal network. However, the nature of the transition from visually guided to memory-guided force control is not as well established. As such, this study examines the spatiotemporal pattern of brain activity that occurs during the transition from visually guided to memory-guided force control. We measured 128-channel scalp electroencephalography (EEG) in healthy individuals while they performed a grip force task. After visual feedback was removed, the first significant change in event-related activity occurred in the left central region by 300 ms, followed by changes in prefrontal cortex by 400 ms. Low-resolution electromagnetic tomography (LORETA) was used to localize the strongest activity to the left ventral premotor cortex and ventral prefrontal cortex. A second experiment altered visual feedback gain but did not require memory. In contrast to memory-guided force control, altering visual feedback gain did not lead to early changes in the left central and midline prefrontal regions. Decreasing the spatial amplitude of visual feedback did lead to changes in the midline central region by 300 ms, followed by changes in occipital activity by 400 ms. The findings show that subjects rely on sensorimotor memory processes involving left ventral premotor cortex and ventral prefrontal cortex after the immediate transition from visually guided to memory-guided force control.
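A minimal sketch of the time-locked averaging such an analysis rests on, using one simulated channel; the sampling rate, trial count, and the injected 300-ms deflection are assumptions (the actual study used 128-channel EEG and LORETA source localization).

```python
# Hedged sketch: event-related average of one simulated EEG channel, time-locked
# to the removal of visual feedback; all parameters below are assumptions.
import numpy as np

fs = 500                                    # sampling rate (Hz), assumed
times = np.arange(-0.2, 0.8, 1.0 / fs)      # -200 ms to +800 ms around removal
rng = np.random.default_rng(3)

# 60 simulated trials with an injected deflection starting 300 ms after removal.
epochs = rng.normal(0.0, 2.0, (60, times.size))
epochs[:, times >= 0.3] += 3.0

erp = epochs.mean(axis=0)                   # event-related average
erp -= erp[times < 0.0].mean()              # baseline correction

# Compare mean amplitude in an early (0-300 ms) and a later (300-600 ms) window.
early = erp[(times >= 0.0) & (times < 0.3)].mean()
late = erp[(times >= 0.3) & (times < 0.6)].mean()
print(f"mean amplitude 0-300 ms: {early:.2f}  300-600 ms: {late:.2f}")
```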


2017 ◽  
Vol 4 (4) ◽  
pp. 132-136 ◽  
Author(s):  
Ting Xu ◽  
Yuan Liu ◽  
Ruijie Pan ◽  
Bin Zhang ◽  
Daqiang Yin ◽  
...  

2011 ◽  
Vol 287-290 ◽  
pp. 688-693
Author(s):  
Guo Tao ◽  
You Hui Xu ◽  
Gang Yang

The novel bis(2,4-dihydro-2H-3-(4-N-maleimido)phenyl-1,3-benzoxazine)isopropane (BMIPBI) was synthesized from maleic anhydride, p-nitroaniline, formaldehyde, and 2,2-bis(4-hydroxyphenyl)propane in a few reaction steps via the intermediates N-(4-nitrophenyl)maleimide (NPMI), N-(4-aminophenyl)maleimide (APMI), and 1,3,5-tris(4-(maleimido)phenyl)-1,3,5-triazine (TMIPT). The chemical structure of BMIPBI was confirmed by 1H-NMR, FT-IR, and elemental analysis (EA), and the results show that the product was the target compound BMIPBI. The synthesis conditions were also studied. BMIPBI was obtained as a loose yellow solid in 58.8% yield.
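The reported 58.8% yield follows the standard percent-yield arithmetic (isolated amount over theoretical amount); the sketch below uses made-up masses purely to illustrate the calculation.

```python
# Hedged sketch: percent yield = 100 * isolated mass / theoretical mass.
# The masses below are hypothetical; only the 58.8% figure comes from the text.
def percent_yield(isolated_mass_g: float, theoretical_mass_g: float) -> float:
    return 100.0 * isolated_mass_g / theoretical_mass_g

print(f"{percent_yield(5.88, 10.00):.1f}% yield")   # -> 58.8% yield
```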


1998 ◽  
Vol 10 (1) ◽  
pp. 122-136 ◽  
Author(s):  
Angela M. Haffenden ◽  
Melvyn A. Goodale

The present study examined the effect of a size-contrast illusion (Ebbinghaus or Titchener Circles Illusion) on visual perception and the visual control of grasping movements. Seventeen right-handed participants picked up and, on other trials, estimated the size of "poker-chip" disks, which functioned as the target circles in a three-dimensional version of the illusion. In the estimation condition, subjects indicated how big they thought the target was by separating their thumb and forefinger to match the target's size. After initial viewing, no visual feedback from the hand or the target was available. Scaling of grip aperture was found to be strongly correlated with the physical size of the disks, while manual estimations of disk size were biased in the direction of the illusion. Evidently, grip aperture is calibrated to the true size of an object, even when perception of object size is distorted by a pictorial illusion, a result that is consistent with recent suggestions that visually guided prehension and visual perception are mediated by separate visual pathways.
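A minimal sketch of the dissociation described above, with simulated responses (disk sizes, slopes, and the ±1 mm illusion bias are all assumptions, not the study's data): grip aperture is regressed on physical disk size, while manual estimates are compared between the two illusory contexts.

```python
# Hedged sketch: grip aperture tracks physical size; estimates carry an
# illusion bias that depends on the surrounding context. All data simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
disk_size = np.repeat([28.0, 29.0, 30.0, 31.0], 20)     # disk diameters (mm)

aperture = 45.0 + 0.8 * disk_size + rng.normal(0, 1.5, disk_size.size)
context = rng.choice([-1.0, 1.0], disk_size.size)       # illusory annulus context
estimate = disk_size + 1.0 * context + rng.normal(0, 1.0, disk_size.size)

slope, intercept, r, p, se = stats.linregress(disk_size, aperture)
print(f"aperture vs physical size: slope = {slope:.2f}, r = {r:.2f}")

t, p = stats.ttest_ind(estimate[context > 0], estimate[context < 0])
print(f"estimates by illusory context: t = {t:.2f}, p = {p:.4f}")
```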


2021 ◽  
Author(s):  
Julian R. Day-Cooney ◽  
Jackson J. Cone ◽  
John H.R. Maunsell

During visually guided behaviors, mere hundreds of milliseconds can elapse between a sensory input and its associated behavioral response. How spikes occurring at different times are integrated to drive perception and action remains poorly understood. We delivered random trains of optogenetic stimulation (white noise) to excite inhibitory interneurons in V1 of mice while they performed a visual detection task. We then performed a reverse correlation analysis on the optogenetic stimuli to generate a neuronal-behavioral kernel: an unbiased, temporally precise estimate of how suppression of V1 spiking at different moments around the onset of a visual stimulus affects detection of that stimulus. Electrophysiological recordings enabled us to capture the effects of optogenetic stimuli on V1 responsivity and revealed that the earliest stimulus-evoked spikes are preferentially weighted for guiding behavior. These data demonstrate that white noise optogenetic stimulation is a powerful tool for understanding how patterns of spiking in neuronal populations are decoded in generating perception and action.
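A minimal sketch of the reverse-correlation logic described above (all data simulated; the bin count, logistic link, and kernel shape are assumptions, not the authors' parameters): the white-noise optogenetic stimulus is averaged separately over hit and miss trials, and the difference serves as an estimate of the neuronal-behavioral kernel.

```python
# Hedged sketch: estimating a behavioral kernel by reverse correlation.
import numpy as np

rng = np.random.default_rng(5)
n_trials, n_bins = 400, 50                 # bins around visual-stimulus onset
stim = rng.integers(0, 2, (n_trials, n_bins)).astype(float)   # white-noise pulses

# Hypothetical ground truth: optogenetic suppression shortly after onset
# (bins 20-25) lowers the probability of detecting the visual stimulus.
kernel_true = np.zeros(n_bins)
kernel_true[20:25] = -1.0
p_hit = 1.0 / (1.0 + np.exp(-(1.0 + stim @ kernel_true)))
hit = rng.random(n_trials) < p_hit

# Neuronal-behavioral kernel: mean stimulus on hits minus mean stimulus on misses.
kernel_est = stim[hit].mean(axis=0) - stim[~hit].mean(axis=0)
print("most negative kernel bin:", int(np.argmin(kernel_est)))
```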


1997 ◽  
Vol 200 (9) ◽  
pp. 1281-1296 ◽  
Author(s):  
R Strauss ◽  
S Schuster ◽  
K G Götz

A computerized 360° panorama allowed us to suppress most of the locomotion-induced visual feedback of a freely walking fly without neutralizing its mechanosensory system ('virtual open-loop' conditions). This novel paradigm achieves control over the fly's visual input by continuously evaluating its actual position and orientation. In experiments with natural visual feedback (closed-loop conditions), the optomotor turning induced by horizontal pattern motion in freely walking Drosophila melanogaster increased with the contrast and brightness of the stimulus. Conspicuously striped patterns were followed with variable speed but often without significant overall slippage. Using standard open-loop conditions in stationary walking flies and virtual open-loop or closed-loop conditions in freely walking flies, we compared horizontal turning induced by either horizontal or vertical motion of appropriately oriented rhombic figures. We found (i) that horizontal displacements and the horizontal-motion illusion induced by vertical displacements of the oblique edges of the rhombic figures elicited equivalent open-loop turning responses; (ii) that locomotion-induced visual feedback from the vertical edges of the rhombic figures in a stationary horizontal position diminished the closed-loop turning elicited by vertical displacements to only one-fifth of the response to horizontal displacements; and (iii) that virtual open-loop responses of mobile flies and open-loop responses of immobilized flies were equivalent in spite of delays of up to 0.1 s in the generation of the virtual stimulus. Horizontal compensatory turning upon vertical displacements of oblique edges is quantitatively consistent with the direction-selective summation of signals from an array of elementary motion detectors for the horizontal stimulus components within their narrow receptive fields. A compensation of the aperture-induced ambiguity can be excluded under these conditions. However, locomotion-induced visual feedback greatly diminished the horizontal-motion illusion in a freely walking fly. The illusion was used to assay the quality of open-loop simulation in the new paradigm.
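A minimal sketch of the 'virtual open-loop' update rule implied above (sign conventions, variable names, and the update interval are assumptions, not the authors' implementation): on every display update, the panorama is advanced by the intended stimulus motion plus the fly's own measured rotation, so that the fly's turning no longer changes what it sees.

```python
# Hedged sketch of a virtual open-loop display update; all names hypothetical.
def update_panorama(pattern_phase_deg, stimulus_rate_deg_s,
                    fly_heading_deg, prev_fly_heading_deg, dt_s):
    """Return the new pattern phase commanded to the display."""
    stimulus_motion = stimulus_rate_deg_s * dt_s           # intended pattern motion
    fly_rotation = fly_heading_deg - prev_fly_heading_deg  # measured self-rotation
    # Adding the fly's own rotation cancels the visual feedback it would
    # normally generate by turning, leaving only the experimenter's stimulus.
    return (pattern_phase_deg + stimulus_motion + fly_rotation) % 360.0

# Example: fly turns 2 deg between frames while the stimulus drifts at 30 deg/s.
phase = update_panorama(0.0, 30.0, fly_heading_deg=92.0,
                        prev_fly_heading_deg=90.0, dt_s=0.02)
print(f"commanded pattern phase: {phase:.2f} deg")
```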

