Vibrissal Location Coding

2015 ◽  
pp. 725-735 ◽  
Author(s):  
Ehud Ahissar ◽  
Per M Knutsen
2019 ◽  
Vol 30 (3) ◽  
pp. 1779-1796 ◽  
Author(s):  
Mikiko Kadohisa ◽  
Kei Watanabe ◽  
Makoto Kusunoki ◽  
Mark J Buckley ◽  
John Duncan

Abstract

Complex cognition is dynamic, with each stage of a task requiring new cognitive processes appropriately linked to stimulus or other content. To investigate control over successive task stages, we recorded neural activity in lateral frontal and parietal cortex as monkeys carried out a complex object selection task, with each trial separated into phases of visual selection and learning from feedback. To study capacity limitation, complexity was manipulated by varying the number of object targets to be learned in each problem. Different task phases were associated with quasi-independent patterns of activity and information coding, with no suggestion of sustained activity linked to a current target. Object and location coding were largely parallel in frontal and inferior parietal cortex, though frontal cortex showed somewhat stronger object representation at feedback and more sustained location coding at choice. At both feedback and choice, coding precision diminished as task complexity increased, matching a decline in performance. We suggest that, across successive task steps, there is radical but capacity-limited reorganization of frontoparietal activity, selecting different cognitive operations linked to their current targets.


1998 ◽  
Vol 13 (2) ◽  
pp. 185-200 ◽  
Author(s):  
Nora Newcombe ◽  
Janellen Huttenlocher ◽  
Anna Bullock Drummey ◽  
Judith G. Wiley

2016 ◽  
Vol 115 (4) ◽  
pp. 2237-2245 ◽  
Author(s):  
Hannah M. Krüger ◽  
Thérèse Collins ◽  
Bernhard Englitz ◽  
Patrick Cavanagh

Orienting our eyes to a light, a sound, or a touch occurs effortlessly, even though sound and touch must be converted from head- and body-based coordinates to eye-based coordinates. We asked whether the oculomotor representation is also used for localizing sounds even when no saccade is made to the sound source. To address this, we examined whether saccades introduced similar localization errors for both visual and auditory stimuli. Sixteen subjects indicated the direction of a visual or auditory apparent motion seen or heard between two targets presented either during fixation or straddling a saccade. Compared with the fixation baseline, saccades introduced errors in direction judgments for both visual and auditory stimuli: in both cases, apparent motion judgments were biased in the direction of the saccade. These saccade-induced effects across modalities suggest the possibility of shared, cross-modal location coding for perception and action.


1980 ◽  
Vol 16 (10) ◽  
pp. 361 ◽  
Author(s):  
M.G.B. Ismail ◽  
R. Steele

2020 ◽  
Vol 189 ◽  
pp. 104703
Author(s):  
Qingfen Hu ◽  
Meng Zhang ◽  
Yi Shao ◽  
Ganzhen Feng