American Crow Brain Activity in Response to Conspecific Vocalizations Changes When Food Is Present

2021 ◽  
Vol 12 ◽  
Author(s):  
LomaJohn T. Pendergraft ◽  
John M. Marzluff ◽  
Donna J. Cross ◽  
Toru Shimizu ◽  
Christopher N. Templeton

Social interaction among animals can occur in many contexts, such as during foraging. Our knowledge of the regions within an avian brain associated with social interaction is limited to regions activated by a single context or sensory modality. We used 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) to examine American crow (Corvus brachyrhynchos) brain activity in response to conditions associated with communal feeding. Using a paired approach, we exposed crows to a visual stimulus (the sight of food), an audio stimulus (the sound of conspecifics vocalizing while foraging), or both audio and visual stimuli presented simultaneously, and compared their brain activity with that evoked by a control stimulus (an empty stage). We found two regions, the nucleus taenia of the amygdala (TnA) and a medial portion of the caudal nidopallium, that showed increased activity in response to the multimodal combination of stimuli but not in response to either stimulus presented unimodally. We also found significantly increased activity in the lateral septum and medially within the nidopallium in response to both the audio-only and the combined audio/visual stimuli. We did not find any differences in activation in response to the visual stimulus by itself. We discuss how these regions may be involved in processing multimodal stimuli in the context of social interaction.
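A minimal sketch of the kind of paired, within-subject contrast this design implies, assuming normalized ROI uptake values per bird; the numbers, effect size, and use of a paired t-test are illustrative assumptions, not the study's data or analysis pipeline.

```python
import numpy as np
from scipy import stats

# Hypothetical normalized FDG uptake in the TnA region for each crow under the
# control (empty stage) and combined audio/visual conditions; invented values,
# not the study's data.
rng = np.random.default_rng(0)
n_birds = 10
control = rng.normal(1.00, 0.05, n_birds)
audio_visual = control + rng.normal(0.06, 0.04, n_birds)

# Paired comparison within birds, mirroring the paired exposure design.
t_stat, p_value = stats.ttest_rel(audio_visual, control)
print(f"TnA: audio+visual vs control, t = {t_stat:.2f}, p = {p_value:.4f}")
```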

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Nienke B. Debats ◽  
Herbert Heuer ◽  
Christoph Kayser

Abstract To organize the plethora of sensory signals from our environment into a coherent percept, our brain relies on the processes of multisensory integration and sensory recalibration. We here asked how visuo-proprioceptive integration and recalibration are shaped by the presence of more than one visual stimulus, hence paving the way to study multisensory perception under more naturalistic settings with multiple signals per sensory modality. We used a cursor-control task in which proprioceptive information on the endpoint of a reaching movement was complemented by two visual stimuli providing additional information on the movement endpoint. The visual stimuli were briefly shown, one synchronously with the hand reaching the movement endpoint, the other delayed. In Experiment 1, the judgments of hand movement endpoint revealed integration and recalibration biases oriented towards the position of the synchronous stimulus and away from the delayed one. In Experiment 2 we contrasted two alternative accounts: that only the temporally more proximal visual stimulus enters integration similar to a winner-takes-all process, or that the influences of both stimuli superpose. The proprioceptive biases revealed that integration—and likely also recalibration—are shaped by the superposed contributions of multiple stimuli rather than by only the most powerful individual one.
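To make the two accounts contrasted in Experiment 2 concrete, here is a toy sketch; the weights, positions, and the idea of a signed weight for the delayed stimulus (negative, to capture a bias away from it) are assumptions for illustration, not the authors' model.

```python
def superposition(prop, vis_sync, vis_delayed, w_sync=0.4, w_delayed=-0.1):
    """Superposed account: the judged endpoint is pulled toward the synchronous
    stimulus and (with a negative weight) pushed away from the delayed one."""
    return prop + w_sync * (vis_sync - prop) + w_delayed * (vis_delayed - prop)

def winner_takes_all(prop, vis_sync, vis_delayed, w_sync=0.4):
    """Alternative account: only the temporally proximal (synchronous) stimulus
    enters integration; the delayed stimulus contributes nothing."""
    return prop + w_sync * (vis_sync - prop)

# Toy endpoint positions (arbitrary units along the movement axis).
hand, cursor_sync, cursor_delayed = 0.0, 2.0, -3.0
print("superposition bias:   ", superposition(hand, cursor_sync, cursor_delayed))
print("winner-takes-all bias:", winner_takes_all(hand, cursor_sync, cursor_delayed))
```

The two functions predict different judged endpoints for the same stimulus layout, which is the dissociation the proprioceptive biases were used to test.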


2021 ◽  
pp. 2150042
Author(s):  
Mirra Soundirarajan ◽  
Ondrej Krejcar ◽  
Hamidreza Namazi

Since the brain regulates our facial reactions, there should be a relationship between brain activity and facial-muscle activity. Moving (dynamic) visual stimuli are an important type of visual stimulus that we deal with in daily life. Since EMG and EEG signals contain information, we evaluated the coupling between the reactions of facial muscles and the brain to various moving visual stimuli by analyzing the information embedded in these signals. We used Shannon entropy to quantify this information. The results showed that a decrease in the information content of the visual stimulus maps onto a decrease in the information content of the EMG and EEG signals; therefore, the activities of the facial muscles and the brain are correlated (Pearson correlation [Formula: see text]). In addition, analysis of the Hurst exponent of the EEG signals demonstrated that increasing their information content increases their memory. This method can also be used to evaluate the coupling between the activity of other organs and brain activity by analyzing the related physiological signals.
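A minimal sketch of the entropy-based coupling analysis described above, assuming a simple histogram estimate of Shannon entropy and invented EEG/EMG traces; the bin count, fixed amplitude range, and signal parameters are illustrative choices, not the paper's exact procedure.

```python
import numpy as np
from scipy import stats

def shannon_entropy(signal, n_bins=32, amp_range=(-8.0, 8.0)):
    """Shannon entropy (bits) of a signal's amplitude distribution, estimated
    from a fixed-range histogram (bin count and range are assumptions)."""
    counts, _ = np.histogram(signal, bins=n_bins, range=amp_range)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# Toy EEG and EMG traces for four hypothetical moving stimuli of increasing
# complexity; the amplitude spreads are invented, not recorded data.
rng = np.random.default_rng(1)
eeg_entropy = [shannon_entropy(rng.normal(0, s, 5000)) for s in (1.0, 1.5, 2.0, 2.5)]
emg_entropy = [shannon_entropy(rng.normal(0, s, 5000)) for s in (0.8, 1.2, 1.7, 2.1)]

# Correlate the information content of the two signal types across stimuli.
r, p = stats.pearsonr(eeg_entropy, emg_entropy)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```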


2021 ◽  
Vol 9 (7_suppl3) ◽  
pp. 2325967121S0013
Author(s):  
Manish Anand ◽  
Jed A. Diekfuss ◽  
Dustin R. Grooms ◽  
Alexis B. Slutsky-Ganesh ◽  
Scott Bonnette ◽  
...  

Background: Aberrant frontal and sagittal plane knee motor control biomechanics contribute to increased anterior cruciate ligament (ACL) injury risk. Emerging data further indicate that alterations in brain function may underlie ACL injury high-risk biomechanics and primary injury. However, technical constraints have limited our ability to assess direct links between maladaptive biomechanics and brain function. Hypothesis/Purpose: (1) Increased frontal plane knee range of motion would associate with altered brain activity in regions important for sensorimotor control and (2) increased sagittal plane knee motor control timing error would associate with altered activity in sensorimotor control brain regions. Methods: Eighteen female high-school basketball and volleyball players (14.7 ± 1.4 years, 169.5 ± 7 cm, 65.8 ± 20.5 kg) underwent brain functional magnetic resonance imaging (fMRI) while performing bilateral, combined hip, knee, and ankle flexion/extension movements against resistance (i.e., a leg press; Figure 1(a)). Participants performed the task to a 1.2 Hz reference beat during four 30-second movement blocks interleaved with five 30-second rest blocks. Concurrent frontal and sagittal plane range of motion (ROM) kinematics were measured using an MRI-compatible single-camera motion capture system. Results: Increased frontal plane ROM was associated with increased brain activity in one cluster extending over the occipital fusiform gyrus and lingual gyrus (p = .003, z > 3.1). Increased sagittal plane motor control timing error was associated with increased brain activity in multiple clusters extending over the occipital cortex (lingual gyrus), frontal cortex, and anterior cingulate cortex (p < .001, z > 3.1); see Figure 1(b). Conclusion: The associations of increased knee frontal plane ROM and sagittal plane timing error with increased activity in regions that integrate visuospatial information may indicate an increased propensity for injury-prone knee biomechanics driven, in part, by reduced spatial awareness and an inability to adequately control knee abduction motion. Increased activation in these regions during movement tasks may underlie an impaired ability to control movements (i.e., less neural efficiency), leading to compromised knee positions during more complex sports scenarios. The association of increased activity in regions important for cognition/attention with motor control timing error further indicates a neurologically inefficient motor control strategy.
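As a simplified, hypothetical stand-in for the brain-behavior association reported above (the study itself used voxelwise, cluster-corrected fMRI statistics at z > 3.1), the sketch below correlates per-athlete frontal-plane ROM with a mean ROI activity estimate; all values are invented.

```python
import numpy as np
from scipy import stats

# Hypothetical per-athlete frontal-plane knee ROM (degrees) and mean BOLD
# parameter estimates from a visuospatial ROI (arbitrary units); invented data.
rng = np.random.default_rng(2)
frontal_rom = rng.normal(12.0, 3.0, 18)
roi_activity = 0.05 * frontal_rom + rng.normal(0.0, 0.1, 18)

# Group-level brain-behavior association analogous to a covariate analysis.
r, p = stats.pearsonr(frontal_rom, roi_activity)
print(f"frontal-plane ROM vs ROI activity: r = {r:.2f}, p = {p:.4f}")
```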


1992 ◽  
Vol 67 (6) ◽  
pp. 1447-1463 ◽  
Author(s):  
K. Nakamura ◽  
A. Mikami ◽  
K. Kubota

1. The activity of single neurons was recorded extracellularly from the monkey amygdala while monkeys performed a visual discrimination task. The monkeys were trained to remember a visual stimulus during a delay period (0.5-3.0 s), to discriminate a new visual stimulus from the remembered one, and to release a lever when the new stimulus was presented. Colored photographs (human faces, monkeys, foods, and nonfood objects) or computer-generated two-dimensional shapes (a yellow triangle, a red circle, etc.) were used as visual stimuli. 2. The activity of 160 task-related neurons was studied. Of these, 144 (90%) responded to visual stimuli, 13 (8%) showed firing during the delay period, and 9 (6%) responded to the reward. 3. Task-related neurons were categorized according to the way in which various stimuli activated them. First, to evaluate the proportion of all tested stimuli that elicited changes in a neuron's activity, selectivity index 1 (SI1) was employed. Second, to evaluate the ability of a neuron to discriminate one stimulus from another, SI2 was employed. On the basis of the calculated values of SI1 and SI2, neurons were classified as selective or nonselective. Most visual neurons were categorized as selective (131/144), and a few were characterized as nonselective (13/144). Neurons active during the delay period were also categorized as selective visual and delay neurons (6/13) and as nonselective delay neurons (7/13). 4. Responses of selective visual neurons had various temporal and stimulus-selective properties. Latencies ranged widely from 60 to 300 ms. Response durations also ranged widely, from 20 to 870 ms. When the natures of the various effective stimuli were studied for each neuron, one-fourth of the responses of these neurons were considered to reflect some categorical aspect of the stimuli, such as human, monkey, food, or nonfood object. Furthermore, the responses of some neurons apparently reflected a certain behavioral significance of the stimuli that was separate from the task, such as the face of a particular person, smiling human faces, etc. 5. Nonselective visual neurons responded to a visual stimulus regardless of its nature. They also responded in the absence of a visual stimulus when the monkey anticipated the appearance of the next stimulus. 6. Selective visual and delay neurons fired in response to particular stimuli and throughout the subsequent delay periods. Nonselective delay neurons increased their discharge rates gradually during the delay period, and the discharge rate decreased after the next stimulus was presented. 7. Task-related neurons were identified in six histologically distinct nuclei of the amygdala. (ABSTRACT TRUNCATED AT 400 WORDS)
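A toy sketch of how selectivity indices of this kind could be computed; the abstract does not give the exact definitions of SI1 and SI2, so the z-score criterion and the best-versus-worst contrast below are assumptions, not the paper's formulas.

```python
import numpy as np

def si1(stimulus_rates, baseline_rates, z_threshold=2.0):
    """Proportion of tested stimuli that change the neuron's firing relative to
    baseline (the z-score criterion is illustrative, not the paper's)."""
    z = (stimulus_rates - baseline_rates.mean()) / baseline_rates.std()
    return float(np.mean(np.abs(z) > z_threshold))

def si2(stimulus_rates):
    """Crude discrimination index contrasting the best and worst stimulus
    responses; a stand-in for the paper's SI2."""
    best, worst = stimulus_rates.max(), stimulus_rates.min()
    return float((best - worst) / (best + worst + 1e-9))

# Toy firing rates (spikes/s) for one neuron across eight stimuli, plus baseline.
rates = np.array([12.0, 35.0, 8.0, 40.0, 11.0, 9.0, 14.0, 10.0])
baseline = np.array([9.0, 10.0, 11.0, 10.5, 9.5, 10.0])
print("SI1:", si1(rates, baseline), " SI2:", round(si2(rates), 2))
```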


1995 ◽  
Vol 12 (4) ◽  
pp. 723-741 ◽  
Author(s):  
W. Guido ◽  
S.-M. Lu ◽  
J.W. Vaughan ◽  
Dwayne W. Godwin ◽  
S. Murray Sherman

Abstract Relay cells of the lateral geniculate nucleus respond to visual stimuli in one of two modes: burst and tonic. The burst mode depends on the activation of a voltage-dependent Ca2+ conductance underlying the low-threshold spike. This conductance is inactivated at depolarized membrane potentials, but when activated from hyperpolarized levels, it leads to a large, triangular, nearly all-or-none depolarization. Typically, riding its crest is a high-frequency barrage of action potentials. Low-threshold spikes thus provide a nonlinear amplification allowing hyperpolarized relay neurons to respond to depolarizing inputs, including retinal EPSPs. In contrast, the tonic mode is characterized by a steady stream of unitary action potentials that more linearly reflects the visual stimulus. In this study, we tested possible differences in detection between response modes of 103 geniculate neurons by constructing receiver operating characteristic (ROC) curves for responses to visual stimuli (drifting sine-wave gratings and flashing spots). Detectability was determined from the ROC curves by computing the area under each curve, known as the ROC area. Most cells switched between modes during recording, evidently due to small shifts in membrane potential that affected the activation state of the low-threshold spike. We found that the more often a cell responded in burst mode, the larger its ROC area. This was true for responses to optimal and nonoptimal visual stimuli, the latter including nonoptimal spatial frequencies and low stimulus contrasts. The larger ROC areas associated with burst mode were due to reduced spontaneous activity with a roughly equivalent level of visually evoked response when compared to tonic mode. We performed a within-cell analysis on a subset of 22 cells that switched modes during recording. Every cell, whether tested with a low-contrast or high-contrast visual stimulus, exhibited a larger ROC area during its burst response mode than during its tonic mode. We conclude that burst responses better support signal detection than do tonic responses. Thus, burst responses, while less linear and perhaps less useful in providing a detailed analysis of visual stimuli, improve target detection. The tonic mode, with its more linear response, seems better suited for signal analysis rather than signal detection.
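A small sketch of the ROC-area computation described above, using invented per-trial spike counts; the values are chosen only to illustrate why a lower spontaneous rate in burst mode yields a larger ROC area.

```python
import numpy as np

def roc_area(driven_counts, spontaneous_counts):
    """Area under the ROC curve: the probability that a randomly drawn
    stimulus-driven spike count exceeds a spontaneous one (ties count as 0.5).
    Equivalent to the normalized Mann-Whitney U statistic."""
    driven = np.asarray(driven_counts, dtype=float)[:, None]
    spont = np.asarray(spontaneous_counts, dtype=float)[None, :]
    return float(np.mean((driven > spont) + 0.5 * (driven == spont)))

# Invented spike counts per trial for the same cell in its two response modes.
burst_driven, burst_spont = [6, 8, 7, 9, 5, 7], [0, 1, 0, 0, 1, 0]
tonic_driven, tonic_spont = [7, 9, 6, 8, 7, 8], [5, 7, 4, 6, 5, 8]
print("burst ROC area:", roc_area(burst_driven, burst_spont))   # ~1.0
print("tonic ROC area:", roc_area(tonic_driven, tonic_spont))   # ~0.82
```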


1996 ◽  
Vol 76 (3) ◽  
pp. 1439-1456 ◽  
Author(s):  
P. Mazzoni ◽  
R. M. Bracewell ◽  
S. Barash ◽  
R. A. Andersen

1. The lateral intraparietal area (area LIP) of the monkey's posterior parietal cortex (PPC) contains neurons that are active during saccadic eye movements. These neurons' activity includes visual and saccade-related components. These responses are spatially tuned and the location of a neuron's visual receptive field (RF) relative to the fovea generally overlaps its preferred saccade amplitude and direction (i.e., its motor field, MF). When a delay is imposed between the presentation of a visual stimulus and a saccade made to its location (memory saccade task), many LIP neurons maintain elevated activity during the delay (memory activity, M), which appears to encode the metrics of the next intended saccadic eye movements. Recent studies have alternatively suggested that LIP neurons encode the locations of visual stimuli regardless of where the animal intends to look. We examined whether the M activity of LIP neurons specifically encodes movement intention or the locations of recent visual stimuli, or a combination of both. In the accompanying study, we investigated whether the intended-movement activity reflects changes in motor plan. 2. We trained monkeys (Macaca mulatta) to memorize the locations of two visual stimuli and plan a sequence of two saccades, one to each remembered target, as we recorded the activity of single LIP neurons. Two targets were flashed briefly while the monkey maintained fixation; after a delay the fixation point was extinguished, and the monkey made two saccades in sequence to each target's remembered location, in the order in which the targets were presented. This "delayed double saccade" (DDS) paradigm allowed us to dissociate the location of visual stimulation from the direction of the planned saccade and thus distinguish neuronal activity related to the target's location from activity related to the saccade plan. By imposing a delay, we eliminated the confounding effect of any phasic responses coincident with the appearance of the stimulus and with the saccade. 3. We arranged the two visual stimuli so that in one set of conditions at least the first one was in the neuron's visual RF, and thus the first saccade was in the neuron's motor field (MF). M activity should be high in these conditions according to both the sensory memory and motor plan hypotheses. In another set of conditions, the second stimulus appeared in the RF but the first one was presented outside the RF, instructing the monkey to plan the first saccade away from the neuron's MF. If the M activity encodes the motor plan, it should be low in these conditions, reflecting the plan for the first saccade (away from the MF). If it is a sensory trace of the stimulus' location, it should be high, reflecting stimulation of the RF by the second target. 4. We tested 49 LIP neurons (in 3 hemispheres of 2 monkeys) with M activity on the DDS task. Of these, 38 (77%) had M activity related to the next intended saccade. They were active in the delay period, as expected, if the first saccade was in their preferred direction. They were less active or silent if the next saccade was not in their preferred direction, even when the second stimulus appeared in their RF. 5. The M activity of 8 (16%) of the remaining neurons specifically encoded the location of the most recent visual stimulus. Their firing rate during the delay reflected stimulation of the RF independently of the saccade being planned. 
The remaining 3 neurons had M activity that did not consistently encode either the next saccade or the stimulus' location. 6. We also recorded the activity of a subset of neurons (n = 38) in a condition in which no stimulus appeared in a neuron's RF, but the second saccade was in the neuron's MF. In this case the majority of neurons tested (23/38, 60%) became active in the period between the first and second saccade, even if neither stimulus had appeared in their RF. Moreover, this activity appeared only after the first saccade had started in all but two of
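The logic of the dissociation above can be captured in a small toy comparison; the firing rates below are invented, and the 0.5 cutoff is just an illustrative threshold, not a criterion from the paper.

```python
import numpy as np

# Invented mean delay-period rates (spikes/s) for one hypothetical LIP neuron
# in the two critical DDS conditions described above.
stim_in_rf_plan_into_mf = np.array([28.0, 31.0, 26.0, 30.0])  # first saccade into MF
stim_in_rf_plan_away = np.array([7.0, 9.0, 6.0, 8.0])         # first saccade away from MF

# If M activity tracks the motor plan, rates should collapse when the first
# saccade is planned away from the MF even though the RF was stimulated.
ratio = stim_in_rf_plan_away.mean() / stim_in_rf_plan_into_mf.mean()
label = "motor-plan coding" if ratio < 0.5 else "a sensory trace of the stimulus"
print(f"plan-away / plan-into ratio = {ratio:.2f} -> consistent with {label}")
```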


2021 ◽  
pp. 1-12
Author(s):  
Anna Borgolte ◽  
Ahmad Bransi ◽  
Johanna Seifert ◽  
Sermin Toto ◽  
Gregor R. Szycik ◽  
...  

Abstract Synaesthesia is a multimodal phenomenon in which the activation of one sensory modality leads to an involuntary additional experience in another sensory modality. To date, normal multisensory processing has hardly been investigated in synaesthetes. In the present study we examine processes of audiovisual separation in synaesthesia by using a simultaneity judgement task. Subjects were asked to indicate whether an acoustic and a visual stimulus occurred simultaneously or not. Stimulus onset asynchronies (SOA) as well as the temporal order of the stimuli were systematically varied. Our results demonstrate that synaesthetes are better in separating auditory and visual events than control subjects, but only when vision leads.
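A toy sketch of how simultaneity-judgement data of this kind are typically summarized; the SOAs and response proportions are invented to illustrate the vision-leads asymmetry, not the study's results.

```python
import numpy as np

# Invented proportions of "simultaneous" responses at each audio-visual SOA
# (ms; negative = vision leads), for illustration only.
soas = np.array([-300, -200, -100, 0, 100, 200, 300])
synaesthetes = np.array([0.10, 0.25, 0.60, 0.95, 0.70, 0.35, 0.15])
controls = np.array([0.20, 0.45, 0.75, 0.95, 0.72, 0.38, 0.18])

# Fewer "simultaneous" responses at vision-leads SOAs indicate better
# audiovisual separation, the pattern reported for synaesthetes above.
vision_leads = soas < 0
print("vision-leads 'simultaneous' rate:",
      round(synaesthetes[vision_leads].mean(), 2), "(synaesthetes) vs",
      round(controls[vision_leads].mean(), 2), "(controls)")
```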


2018 ◽  
Vol 7 ◽  
pp. 172-177
Author(s):  
Łukasz Tyburcy ◽  
Małgorzata Plechawska-Wójcik

The paper describes the results of a comparison of reaction times to visual and auditory stimuli using EEG evoked potentials. Two experiments were conducted: the first explored reaction times to a visual stimulus and the second to an auditory stimulus. Analysis of the data showed that visual stimuli evoke faster reactions than auditory stimuli.
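A minimal sketch of the reaction-time comparison, assuming RTs measured relative to EEG-marked stimulus onsets; the values and the use of an independent-samples t-test are illustrative assumptions mirroring only the reported direction of the effect.

```python
import numpy as np
from scipy import stats

# Hypothetical reaction times (ms) measured from EEG-marked stimulus onsets in
# the two experiments; invented values, not the study's data.
rng = np.random.default_rng(3)
rt_visual = rng.normal(215.0, 25.0, 40)
rt_auditory = rng.normal(245.0, 30.0, 40)

t_stat, p_value = stats.ttest_ind(rt_visual, rt_auditory)
print(f"visual {rt_visual.mean():.0f} ms vs auditory {rt_auditory.mean():.0f} ms "
      f"(t = {t_stat:.2f}, p = {p_value:.4f})")
```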


2021 ◽  
Author(s):  
Constantinos Eleftheriou

The goal of this protocol is to assess visuomotor learning and motor flexibility in freely-moving mice using the Visiomode touchscreen platform. Water-restricted mice first learn to associate touching a visual stimulus on the screen with a water reward. They then learn to discriminate between different visual stimuli on the touchscreen by nose-poking, before being asked to switch their motor strategy to forelimb reaching. Version 1 of the protocol motivates mice with traditional water deprivation and water rewards in the task. Version 2 uses citric acid for water restriction and sucrose rewards in the task instead of the traditional water-deprivation protocol.
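A hypothetical sketch of the protocol's stage progression in plain Python; the stage names, response types, and advancement criteria are assumptions for illustration and are not Visiomode's actual API or the protocol's exact criteria.

```python
from dataclasses import dataclass

@dataclass
class Stage:
    name: str
    response: str      # "nose-poke" or "forelimb-reach"
    criterion: float   # hit rate required to advance (assumed values)

STAGES = [
    Stage("touch-for-reward", "nose-poke", 0.70),
    Stage("visual-discrimination", "nose-poke", 0.75),
    Stage("reach-discrimination", "forelimb-reach", 0.75),
]

def next_stage(current: int, hit_rate: float) -> int:
    """Advance to the next training stage once the mouse's hit rate meets the
    current stage's criterion; otherwise stay at the current stage."""
    if hit_rate >= STAGES[current].criterion and current + 1 < len(STAGES):
        return current + 1
    return current

print(STAGES[next_stage(0, 0.80)].name)  # -> visual-discrimination
```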


2021 ◽  
pp. 2150048
Author(s):  
Hamidreza Namazi ◽  
Avinash Menon ◽  
Ondrej Krejcar

Our eyes constantly explore our surrounding environment. The brain controls our eyes' activities through the nervous system. Hence, analyzing the correlation between the activities of the eyes and the brain is an important area of research in vision science. This paper evaluates the coupling between the reactions of the eyes and the brain in response to different moving visual stimuli. Since both eye movements and EEG signals (as an indicator of brain activity) contain information, we employed Shannon entropy to evaluate the coupling between them. Ten subjects looked at four moving objects (dynamic visual stimuli) with different information contents while we recorded their EEG signals and eye movements. The results demonstrated that the changes in the information contents of eye movements and EEG signals are strongly correlated ([Formula: see text]), which indicates a strong correlation between brain and eye activities. This analysis could be extended to evaluate the correlation between the activities of other organs and the brain.
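A complementary toy sketch quantifying the information content of eye movements via the entropy of a 2-D gaze-position histogram; the binning, fixed spatial range, and gaze traces are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def gaze_entropy(x, y, n_bins=16, extent=8.0):
    """Shannon entropy (bits) of a 2-D gaze-position distribution over a fixed
    spatial range (bin count and range are assumptions)."""
    counts, _, _ = np.histogram2d(x, y, bins=n_bins,
                                  range=[[-extent, extent], [-extent, extent]])
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# Invented gaze traces while watching two moving stimuli of different
# complexity (degrees of visual angle); not recorded data.
rng = np.random.default_rng(4)
simple_x, simple_y = rng.normal(0, 0.5, 2000), rng.normal(0, 0.5, 2000)
complex_x, complex_y = rng.normal(0, 2.5, 2000), rng.normal(0, 2.5, 2000)
print("simple stimulus: ", round(gaze_entropy(simple_x, simple_y), 2), "bits")
print("complex stimulus:", round(gaze_entropy(complex_x, complex_y), 2), "bits")
```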

