action goal
Recently Published Documents

TOTAL DOCUMENTS: 30 (FIVE YEARS: 8)
H-INDEX: 6 (FIVE YEARS: 2)


NeuroImage ◽  
2021 ◽  
pp. 118511
Author(s):  
Antonino Errante ◽  
Settimio Ziccarelli ◽  
Gloria Mingolla ◽  
Leonardo Fogassi

2021 ◽  
pp. JN-RM-1522-21
Author(s):  
Regina C Lapate ◽  
Ian C Ballard ◽  
Marisa K Heckner ◽  
Mark D'Esposito

Emotional states provide an ever-present source of contextual information that should inform behavioral goals. Despite the ubiquity of emotional signals in our environment, the neural mechanisms underlying their influence on goal-directed action remain unclear. Prior work suggests that the lateral frontal pole (FPl) is uniquely positioned to integrate affective information into cognitive control representations. We used pattern similarity analysis to examine the content of representations in FPl and interconnected mid-lateral prefrontal and amygdala circuitry. Healthy participants (n=37; n=21 females) were scanned while undergoing an event-related Affective Go/No-Go task, which requires goal-oriented action selection during emotional processing. We found that FPl contained conjunctive emotion-action goal representations that were related to successful cognitive control during emotional processing. These representations differed from the conjunctive emotion-action goal representations found in the basolateral amygdala. While robust action goal representations were present in mid-lateral prefrontal cortex, they were not modulated by emotional valence. Finally, converging results from functional connectivity and multivoxel pattern analyses indicated that FPl's emotional valence signals likely originated from the interconnected subgenual ACC (BA25), which was in turn functionally coupled with the amygdala. Thus, our results identify a key pathway by which internal emotional states influence goal-directed behavior.
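The pattern similarity analysis mentioned above can be illustrated with a minimal sketch, not the authors' code: every value below (four fake voxels, four conditions crossing valence with Go/No-Go action goal) is invented, and the Pearson correlation between condition-wise voxel patterns serves as the similarity measure.

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length voxel patterns."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    sd_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
    sd_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
    return cov / (sd_a * sd_b)

# Invented 4-voxel patterns for the four conditions of an Affective
# Go/No-Go design (emotional valence x action goal).
patterns = {
    ("positive", "go"):    [0.90, 0.10, 0.80, 0.20],
    ("negative", "go"):    [0.85, 0.15, 0.75, 0.25],
    ("positive", "no-go"): [0.20, 0.80, 0.10, 0.90],
    ("negative", "no-go"): [0.25, 0.85, 0.20, 0.80],
}

# Same action goal across valences vs. different goals within a valence.
same_goal = pearson(patterns[("positive", "go")], patterns[("negative", "go")])
diff_goal = pearson(patterns[("positive", "go")], patterns[("positive", "no-go")])
```

A region whose same-goal similarity tracks valence (rather than collapsing across it) would be a candidate for conjunctive emotion-action coding.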


2021 ◽  
Author(s):  
Christian Gumbsch ◽  
Maurits Adam ◽  
Birgit Elsner ◽  
Martin V. Butz

From about six months of age onwards, infants start to reliably fixate the goal of an observed action, such as a grasp, before the action is complete. The available research has identified a variety of factors that influence such goal-anticipatory gaze shifts, including experience with the shown action events and familiarity with the observed agents. However, the underlying cognitive processes are still heavily debated. We propose that our minds (i) tend to structure sensorimotor dynamics into probabilistic, generative event- and event-boundary-predictive models and, meanwhile, (ii) choose actions with the objective of minimizing predicted uncertainty. We implement this proposition by means of event-predictive learning and active inference. The implemented learning mechanism induces an inductive, event-predictive bias, thus developing schematic encodings of experienced events and event boundaries. The implemented active inference principle chooses actions that aim to minimize expected future uncertainty. We train our system on multiple object-manipulation events. As a result, the generation of goal-anticipatory gaze shifts emerges while learning about object manipulations: the model begins fixating the inferred goal at the start of an observed event once it has sampled some experience with possible events and when a familiar agent (i.e., a hand) is involved. Meanwhile, the model keeps reactively tracking an unfamiliar agent (i.e., a mechanical claw) performing the same movement. We conclude that event-predictive learning combined with active inference may be critical for eliciting infant action-goal prediction.
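The uncertainty-minimization principle can be sketched in a few lines. This is not the authors' implementation: the predictive distributions below are invented, and the sketch only shows that, under an active-inference-style rule, a gaze target is chosen to minimize the expected entropy of predicted event outcomes.

```python
import math

def entropy(probs):
    """Shannon entropy (bits) of a discrete outcome distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Invented predictive distributions: what a generative event model expects
# to observe at each candidate fixation target. A familiar agent (a hand)
# supports a sharp goal prediction; reactively tracking the effector does not.
predicted_outcomes = {
    "goal_location": [0.90, 0.05, 0.05],  # confident event-boundary prediction
    "moving_agent":  [0.40, 0.30, 0.30],  # reactive tracking, high uncertainty
}

# Active-inference-style action selection: fixate wherever the expected
# future uncertainty (entropy of predicted outcomes) is lowest.
choice = min(predicted_outcomes, key=lambda t: entropy(predicted_outcomes[t]))
```

With an unfamiliar agent the goal prediction would stay flat, so the same rule yields reactive tracking rather than an anticipatory shift.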


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Sam Clarke ◽  
Luke McEllin ◽  
Anna Francová ◽  
Marcell Székely ◽  
Stephen A. Butterfill ◽  
...  

Joint actions often require agents to track others’ actions while planning and executing physically incongruent actions of their own. Previous research has indicated that this can lead to visuomotor interference effects when it occurs outside of joint action. How is this avoided or overcome in joint actions? We hypothesized that when joint action partners represent their actions as interrelated components of a plan to bring about a joint action goal, each partner’s movements need not be represented in relation to distinct, incongruent proximal goals. Instead, they can be represented in relation to a single proximal goal – especially if the movements are, or appear to be, mechanically linked to a more distal joint action goal. To test this, we implemented a paradigm in which participants produced finger movements that were either congruent or incongruent with those of a virtual partner, and either with or without a joint action goal (the joint flipping of a switch, which turned on two light bulbs). Our findings provide partial support for the hypothesis that visuomotor interference effects can be reduced when two physically incongruent actions are represented as mechanically interdependent contributions to a joint action goal.
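The interference measure implied by this paradigm is simple arithmetic: mean reaction time on incongruent trials minus mean reaction time on congruent trials, computed separately with and without a joint action goal. The reaction times below are invented for illustration.

```python
from statistics import mean

# Invented reaction times (ms), by goal condition and congruency.
rts = {
    ("no_joint_goal", "congruent"):   [410, 405, 415],
    ("no_joint_goal", "incongruent"): [450, 460, 455],
    ("joint_goal", "congruent"):      [412, 408, 416],
    ("joint_goal", "incongruent"):    [425, 430, 420],
}

def interference(goal_condition):
    """Visuomotor interference: incongruent minus congruent mean RT."""
    return (mean(rts[(goal_condition, "incongruent")])
            - mean(rts[(goal_condition, "congruent")]))

baseline_effect = interference("no_joint_goal")  # interference without a joint goal
joint_effect = interference("joint_goal")        # reduced under a joint action goal
```

A smaller difference in the joint-goal condition is the signature of reduced interference that the hypothesis predicts.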


eLife ◽  
2019 ◽  
Vol 8 ◽  
Author(s):  
Lalitta Suriya-Arunroj ◽  
Alexander Gail

Prior expectations of movement instructions can promote preliminary action planning and influence choices. We investigated how action priors affect action-goal encoding in premotor and parietal cortices and whether they bias subsequent free choices. Monkeys planned reaches according to visual cues that indicated the relative probabilities of two possible goals. On instructed trials, the reach goal was determined by a secondary cue respecting these probabilities. On rarely interspersed free-choice trials without instruction, both goals offered equal reward. Action priors induced graded free-choice biases and graded frontoparietal motor-goal activity, complementarily in two subclasses of neurons. Down-regulating neurons co-encoded both possible goals and decreased opposite-to-preferred responses with decreasing prior, possibly supporting a process of choice by elimination. Up-regulating neurons showed increased preferred-direction responses with increasing prior, likely supporting a process of computing net likelihood. Action-selection signals emerged earliest in down-regulating neurons of premotor cortex, arguing for an initiation of selection in the frontal lobe.
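A graded prior-to-bias mapping of the kind reported here can be sketched with a softmax rule. This is not the authors' model: the inverse-temperature parameter beta and the prior-as-value assignment are assumptions made for illustration only.

```python
import math

def choice_probability(prior_a, beta=3.0):
    """Softmax probability of choosing goal A on a free-choice trial,
    given the cued prior for A. With equal reward on both goals, the
    prior alone tips the balance; beta is an assumed inverse temperature."""
    value_a, value_b = prior_a, 1.0 - prior_a
    exp_a, exp_b = math.exp(beta * value_a), math.exp(beta * value_b)
    return exp_a / (exp_a + exp_b)

# Graded priors produce graded choice biases, as on the free-choice trials.
biases = {p: choice_probability(p) for p in (0.2, 0.5, 0.8)}
```

An equal prior leaves the choice unbiased, while stronger priors bias the choice monotonically toward the more probable goal.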


2019 ◽  
Vol 45 (8) ◽  
pp. 1441-1454 ◽  
Author(s):  
Frank Papenmeier ◽  
Annika Boss ◽  
Anne-Kathrin Mahlke

2018 ◽  
Author(s):  
Sam Clarke ◽  
Anna Francová ◽  
Marcell Székely ◽  
Stephen Butterfill ◽  
John Michael



2018 ◽  
Vol 72 (7) ◽  
pp. 1756-1770 ◽  
Author(s):  
Paul AG Forbes ◽  
Steph F Suddell ◽  
Harry Farmer ◽  
Yanakan Logeswaran ◽  
Antonia F de C Hamilton

Whether pointing at a menu item or rifling through a clothes rack, when we choose we often move. We investigated whether people’s tendency to copy the movements of others could influence their choices. Participants saw pairs of pictures in private and indicated which one they preferred. They then entered a virtual art gallery and saw the same picture pairs in the presence of a virtual character. Having observed the virtual character point to indicate her preference with either a high or low movement trajectory, participants indicated their preference. There was either an anatomical correspondence (same movement, same choice) or a spatial correspondence (same movement, different choice) between the participant’s pictures and those of the virtual character. We found that participants copied the movement made by the virtual character rather than her action goal (i.e., her choice of picture). This resulted in a shift towards the virtual character’s preferences in the anatomical condition but away from her preferences in the spatial condition. The effect was driven by observation of the virtual character’s high pointing movements. In a further experiment, we found no significant differences in imitation behaviour in autism, although autistic participants were less consistent in their choices. Our findings demonstrate that we are influenced not only by others’ choices but also by the types of movements others make to indicate those choices.
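The logic of the two mappings can be made explicit with a small sketch (a hypothetical coding function, not the authors' analysis): copying the partner's movement implies choosing the same picture under the anatomical mapping but the other picture under the spatial mapping.

```python
# Hypothetical coding of the two stimulus-response mappings described above.
def implied_choice(correspondence, copied_movement):
    """If the participant copies the partner's movement, the anatomical
    mapping yields the same picture choice as the partner, while the
    spatial mapping yields the other picture (and vice versa)."""
    if copied_movement:
        return "same_picture" if correspondence == "anatomical" else "other_picture"
    return "other_picture" if correspondence == "anatomical" else "same_picture"

# Movement copying shifts choices toward the partner's preference under the
# anatomical mapping but away from it under the spatial mapping.
shift_toward = implied_choice("anatomical", copied_movement=True)
shift_away = implied_choice("spatial", copied_movement=True)
```

This is why movement copying, rather than goal copying, predicts the opposite preference shifts observed in the two conditions.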

