When Heuristics Clash with Parsing Routines: ERP Evidence for Conflict Monitoring in Sentence Perception

2006
Vol 18 (7)
pp. 1181-1197
Author(s):  
Marieke van Herten ◽  
Dorothee J. Chwilla ◽  
Herman H. J. Kolk

Monitoring refers to a process of quality control designed to optimize behavioral outcome. Monitoring for action errors manifests itself in an error-related negativity in event-related potential (ERP) studies and in an increase in activity of the anterior cingulate in functional magnetic resonance imaging studies. Here we report evidence for a monitoring process in perception, in particular, language perception, manifesting itself in a late positivity in the ERP. This late positivity, the P600, appears to be triggered by a conflict between two interpretations, one delivered by the standard syntactic algorithm and one by a plausibility heuristic which combines individual word meanings in the most plausible way. To resolve this conflict, we propose that the brain reanalyzes the memory trace of the perceptual input to check for the possibility of a processing error. Thus, as in Experiment 1, when the reader is presented with semantically anomalous sentences such as, “The fox that shot the poacher…,” full syntactic analysis indicates a semantic anomaly, whereas the word-based heuristic leads to a plausible interpretation, that of a poacher shooting a fox. That readers actually pursue such a word-based analysis is indicated by the fact that the usual ERP index of semantic anomaly, the so-called N400 effect, was absent in this case. A P600 effect appeared instead. In Experiment 2, we found that even when the word-based heuristic indicated that only part of the sentence was plausible (e.g., “…that the elephants pruned the trees”), a P600 effect was observed and the N400 effect of semantic anomaly was absent. It thus seems that the plausibility of part of the sentence (e.g., that of pruning trees) was sufficient to create a conflict with the implausible meaning of the sentence as a whole, giving rise to a monitoring response.

2020
Vol 46 (Supplement_1)
pp. S50-S50
Author(s):  
Jihye Park ◽  
Minah Kim ◽  
Wu Jeong Hwang ◽  
Jun Soo Kwon

Background: Impaired error/conflict monitoring, as reflected in event-related potentials (ERPs), has consistently been reported in patients with schizophrenia. However, whether this impairment exists from the early phase of psychosis, such as first-episode psychosis (FEP), has not yet been clearly established. To investigate the presence of an error/conflict monitoring deficit in early psychosis, we examined the error-related negativity (ERN), error-related positivity (Pe), and correct-response negativity (CRN) during a Go/Nogo task in patients with FEP. Methods: Twenty-five patients with FEP and 25 age- and sex-matched healthy controls (HCs) underwent electroencephalographic recording during the Go/Nogo task. Trials with error responses were analyzed to define the ERN at the Fz electrode site and the Pe at the Pz electrode site. Trials with correct responses were used for CRN analysis at the Fz electrode site. Independent-samples t-tests were used to compare the amplitudes of the ERP components between the FEP and HC groups. Pearson's correlation analysis was performed to assess the relationship between altered ERP components and symptom severity in the patients. Results: FEP patients showed significantly smaller ERN amplitude at the Fz electrode site compared with HCs (t = -3.294, p = 0.002). However, there was no difference in CRN (t = 0.017, p = 0.986) or Pe (t = 1.806, p = 0.077) amplitudes between the FEP and HC groups. There was no significant correlation between symptom severity and ERN amplitude at the Fz electrode site in FEP patients. Discussion: These findings suggest that impairments in error/conflict monitoring, as reflected by ERN amplitude, exist from the early course of psychotic disorder. Future studies with larger samples and subjects at even earlier phases, such as clinical high risk for psychosis, are needed to confirm these findings.
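The group comparison described in this abstract can be sketched with an independent-samples t-test on mean ERN amplitudes. The amplitude values below are illustrative assumptions, not the study's data; only the analysis type (two-group t-test on Fz ERN amplitude) comes from the abstract.

```python
# Hedged sketch: compare mean ERN amplitudes (in microvolts, at Fz)
# between an FEP group and a control group with an independent-samples
# t-test. All numbers are hypothetical examples.
from scipy import stats

fep_ern = [-2.1, -3.0, -1.8, -2.5, -2.2, -1.9, -2.8, -2.4]  # hypothetical FEP group
hc_ern = [-4.2, -5.1, -3.8, -4.6, -4.0, -4.9, -4.4, -4.3]   # hypothetical controls

# A less negative mean in the first group gives a positive t with this ordering.
t, p = stats.ttest_ind(fep_ern, hc_ern, equal_var=True)
print(f"t = {t:.3f}, p = {p:.4f}")
```

The sign and magnitude of t depend on group order and on the sign convention for ERN amplitude, so only the comparison logic, not the particular statistic, should be read from this sketch.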


2008
Vol 20 (9)
pp. 1637-1655
Author(s):  
Borís Burle ◽  
Clémence Roger ◽  
Sonia Allain ◽  
Franck Vidal ◽  
Thierry Hasbroucq

Our ability to detect and correct errors is essential for our adaptive behavior. The conflict-loop theory states that the anterior cingulate cortex (ACC) plays a key role in detecting the need to increase control through conflict monitoring. Such monitoring is assumed to manifest itself in an electroencephalographic (EEG) component, the “error negativity” (Ne or “error-related negativity” [ERN]). We have directly tested the hypothesis that the ACC monitors conflict through simulation and experimental studies. Both the simulated and EEG traces were sorted, on a trial-by-trial basis, as a function of the degree of conflict, measured as the temporal overlap between incorrect and correct response activations. The simulations clearly show that conflict increases as the temporal overlap between response activations increases, whereas the experimental results demonstrate that the amplitude of the Ne decreases as temporal overlap increases, suggesting that the ACC does not monitor conflict. At a functional level, the results show that the duration of the Ne depends on the time needed to correct (partial) errors, revealing an “on-line” modulation of control on a very short time scale.


2013
Vol 27 (3)
pp. 113-123
Author(s):  
Evelien Kostermans ◽  
Renske Spijkerman ◽  
Rutger C. M. E. Engels ◽  
Harold Bekkering ◽  
Ellen R. A. de Bruijn

Different theoretical accounts have attempted to integrate anterior cingulate cortex involvement in relation to conflict detection, error-likelihood predictions, and error monitoring. Regarding the latter, event-related potential studies have identified the feedback-related negativity (FRN) component in relation to processing feedback which indicates that a particular outcome was worse than expected. According to the conflict-monitoring theory, the stimulus-locked N2 reflects pre-response conflict. Assumptions of these theories have been made on the basis of relatively simple response-mapping tasks, rather than the more complex decision-making processes associated with everyday situations. The question remains whether expectancies and conflicts induced by everyday knowledge similarly affect decision-making processes. To answer this question, electroencephalogram and behavioral measurements were obtained while participants performed a simulated traffic task that varied high- and low-ambiguity situations at an intersection by presenting multiple varying traffic light combinations. Although feedback was kept constant across conditions, the tendency to cross was more pronounced for traffic light combinations that in real life are associated with proceeding, as opposed to more ambiguous traffic light combinations not uniquely associated with a specific response. On a neurophysiological level, the stimulus-locked N2 was enhanced on trials that induced experience-based conflict, and the FRN was more pronounced for negative as compared to positive feedback, but did not differ as a function of everyday expectancies related to traffic rules. The current study shows that well-learned everyday rules may influence decision-making processes in situations that are associated with the application of these rules, even if responding accordingly does not lead to the intended outcomes.


2009
Vol 21 (4)
pp. 684-696
Author(s):  
Rongjun Yu ◽  
Xiaolin Zhou

The functional significance of error-related negativity (Ne/ERN), which occurs at approximately the same time as erroneous responses, has been investigated extensively using reaction time (RT) tasks. The error detection theory assumes that the Ne/ERN reflects the mismatch detected by comparing representations of the intended and the actually performed actions. The conflict monitoring theory asserts that the Ne/ERN reflects the detection of response conflict between intended and actually performed actions during response selection. In this study, we employed a gambling task in which participants were required to choose whether they would take part in betting in each trial and they were presented with gain or loss feedback in both the “to bet” and the “not to bet” trials. The response-locked ERP magnitudes were more negative for “to bet” than for “not to bet” choices for both large and small stakes and were more negative for choices involving large rather than small stakes. Dipole source analysis localized the ERP responses to the anterior cingulate cortex (ACC). These findings suggest that the ACC signals the riskiness of choices and may function as an early warning system that alerts the brain to prepare for the potential negative consequence associated with a risky action.


2007
Vol 19 (7)
pp. 1104-1112
Author(s):  
Mike Wendt ◽  
Marcus Heldmann ◽  
Thomas F. Münte ◽  
Rainer H. Kluwe

Conflict monitoring theory holds that detection of conflicts in information processing by the anterior cingulate cortex (ACC) results in processing adaptation that minimizes subsequent conflict. Applying an Eriksen flanker task with four stimuli mapped onto two responses, we investigated whether such modulation occurs only after response-related or also after stimulus-related conflict, focusing on the N2 component of the event-related potential. Contrasting with previous findings, both stimulus- and response-related conflict elicited enhancement of the N2, suggesting that the ACC is sensitive to conflict at both the stimulus and the response level. However, neither type of conflict resulted in reduced conflict effects on the following trial when stimulus-response (S-R) sequence effects were controlled by excluding identical S-R repetition trials. Identical S-R repetitions were associated with facilitated processing, thus demonstrating that inclusion of these trials in the analysis may mimic results predicted by the conflict adaptation hypothesis.


2007
Vol 19 (12)
pp. 1994-2004
Author(s):  
Flavio T. P. Oliveira ◽  
John J. McDonald ◽  
David Goodman

Several converging lines of evidence suggest that the anterior cingulate cortex (ACC) is selectively involved in error detection or evaluation of poor performance. Here we challenge this notion by presenting event-related potential (ERP) evidence that the feedback-elicited error-related negativity, an ERP component attributed to the ACC, can be elicited by positive feedback when a person is expecting negative feedback, and vice versa. These results suggest that performance monitoring in the ACC is not limited to error processing. We propose that the ACC acts as part of a more general performance-monitoring system that is activated by violations in expectancy. Further, we propose that the common observation of increased ACC activity elicited by negative events could be explained by an overoptimistic bias in generating expectations of performance. These results could shed light on neurobehavioral disorders, such as depression and mania, that are associated with alterations in performance monitoring and in judgments of self-related events.


2011
Vol 23 (11)
pp. 3181-3196
Author(s):  
Laura Batterink ◽  
Helen Neville

The vast majority of word meanings are learned simply by extracting them from context rather than by rote memorization or explicit instruction. Although this skill is remarkable, little is known about the brain mechanisms involved. In the present study, ERPs were recorded as participants read stories in which pseudowords were presented multiple times, embedded in consistent, meaningful contexts (referred to as meaning condition, M+) or inconsistent, meaningless contexts (M−). Word learning was then assessed implicitly using a lexical decision task and explicitly through recall and recognition tasks. Overall, during story reading, M− words elicited a larger N400 than M+ words, suggesting that participants were better able to semantically integrate M+ words than M− words throughout the story. In addition, M+ words whose meanings were subsequently correctly recognized and recalled elicited a more positive ERP in a later time window compared with M+ words whose meanings were incorrectly remembered, consistent with the idea that the late positive component is an index of encoding processes. In the lexical decision task, no behavioral or electrophysiological evidence for implicit priming was found for M+ words. In contrast, during the explicit recognition task, M+ words showed a robust N400 effect. The N400 effect was dependent upon recognition performance, such that only correctly recognized M+ words elicited an N400. This pattern of results provides evidence that the explicit representations of word meanings can develop rapidly, whereas implicit representations may require more extensive exposure or more time to emerge.


2020
pp. 174702182098462
Author(s):  
Masataka Yano ◽  
Shugo Suwazono ◽  
Hiroshi Arao ◽  
Daichi Yasunaga ◽  
Hiroaki Oishi

The present study conducted two event-related potential experiments to investigate whether readers adapt their expectations to morphosyntactically (Experiment 1) or semantically (Experiment 2) anomalous sentences when they are repeatedly exposed to them. To address this issue, we manipulated the probability of occurrence of grammatical and anomalous sentences across blocks. In the low-probability block, anomalous sentences were presented less frequently than grammatical sentences (at a ratio of 1 to 4), whereas in the equal-probability block they were presented as frequently as grammatical sentences. Experiment 1 revealed a smaller P600 effect for morphosyntactic violations in the equal-probability block than in the low-probability block. Linear mixed-effects models were used to examine how the size of the P600 effect changed over the course of the experiment. The results showed that the smaller P600 effect in the equal-probability block resulted from a decline in amplitude for morphosyntactically violated sentences over the course of the experiment, suggesting adaptation to morphosyntactic violations. In Experiment 2, semantically anomalous sentences elicited a larger N400 effect than their semantically natural counterparts regardless of the probability manipulation. No evidence was found in favor of adaptation to semantic violations: the processing cost of semantic violations did not decrease over the course of the experiment. The present study thus demonstrates a dynamic aspect of the language-processing system. We discuss why the language-processing system shows selective adaptation to morphosyntactic violations.
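The linear mixed-effects analysis described in this abstract can be sketched as a model of single-trial P600 amplitude with trial order, probability block, and their interaction as fixed effects, and random intercepts per subject. All column names, data, and effect sizes below are illustrative assumptions; only the general modeling approach comes from the abstract.

```python
# Hedged sketch: mixed-effects model of simulated single-trial P600
# amplitudes, with a trial-by-block interaction capturing a possible
# amplitude decline (adaptation) in one block. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subjects, n_trials = 10, 40
rows = []
for subj in range(n_subjects):
    subj_offset = rng.normal(0, 0.5)  # random subject intercept
    for block in ("low_prob", "equal_prob"):
        # Simulated adaptation: amplitude declines over trials only in
        # the equal-probability block.
        slope = -0.02 if block == "equal_prob" else 0.0
        for trial in range(n_trials):
            amp = 3.0 + subj_offset + slope * trial + rng.normal(0, 1.0)
            rows.append({"subject": subj, "block": block,
                         "trial": trial, "p600": amp})
df = pd.DataFrame(rows)

# Random intercept per subject; fixed effects for trial, block, and
# their interaction.
model = smf.mixedlm("p600 ~ trial * block", df, groups=df["subject"])
result = model.fit()
print(result.summary())
```

In this setup, evidence for adaptation would appear as a reliable trial-by-block interaction term; the study's actual model specification may of course differ.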


Author(s):  
Anne Schienle ◽  
Albert Wabnegger

An extremely bitter taste can signal food spoilage, and therefore typically elicits disgust. The present cross-modal functional magnetic resonance imaging experiment investigated whether the personality trait ‘disgust propensity’ (DP; temporally stable tendency to experience disgust across different situations) has an influence on the processing of visual food cues during bitter aftertaste perception. Thirty females with high DP and 30 females with low DP viewed images depicting sweet food (e.g., cakes, ice cream) and vegetables, once in combination with an extremely bitter aftertaste (concentrated wormwood tea), and once with a neutral taste (water). Females highly prone to disgust (compared to low disgust-prone females) showed increased activity in the anterior cingulate cortex (ACC) and increased mPFC-insula connectivity when presented with the mismatch of a bitter aftertaste and visual cues of sweet food. The ACC is involved in conflict monitoring and is strongly interconnected with insular areas. This connection plays a critical role in awareness of changes in homeostatic states. Our findings indicate that the personality trait DP is associated with cross-modal integration processes of disgust-relevant information. Females high in DP were more alert to food-related sensory mismatch (pleasant visual features, aversive taste) than females low in DP.


2009
Vol 21 (11)
pp. 2245-2262
Author(s):  
Daphne J. Holt ◽  
Spencer K. Lynn ◽  
Gina R. Kuperberg

Although the neurocognitive mechanisms of nonaffective language comprehension have been studied extensively, relatively less is known about how the emotional meaning of language is processed. In this study, electrophysiological responses to affectively positive, negative, and neutral words, presented within nonconstraining, neutral contexts, were evaluated under conditions of explicit evaluation of emotional content (Experiment 1) and passive reading (Experiment 2). In both experiments, a widely distributed Late Positivity was found to be larger to negative than to positive words (a “negativity bias”). In addition, in Experiment 2, a small, posterior N400 effect to negative and positive (relative to neutral) words was detected, with no differences found between N400 magnitudes to negative and positive words. Taken together, these results suggest that comprehending the emotional meaning of words following a neutral context requires an initial semantic analysis that is relatively more engaged for emotional than for nonemotional words, whereas a later, more extended, attention-modulated process distinguishes the specific emotional valence (positive vs. negative) of words. Thus, emotional processing networks within the brain appear to exert a continuous influence, evident at several stages, on the construction of the emotional meaning of language.

