A self-organized sentence processing theory of gradience: The case of islands

Cognition ◽  
2022 ◽  
Vol 222 ◽  
pp. 104943
Author(s):  
Sandra Villata ◽  
Whitney Tabor

2019 ◽
Author(s):  
Garrett Smith ◽  
Shravan Vasishth

Among theories of human language comprehension, cue-based memory retrieval has proven to be a useful framework for understanding when and how processing difficulty arises in the resolution of long-distance dependencies. Most previous work in this area has assumed that very general retrieval cues like [+subject] or [+singular] do the work of identifying (and sometimes misidentifying) a retrieval target in order to establish a dependency between words. However, recent work suggests that general, hand-picked retrieval cues like these may not be enough to explain illusions of plausibility (Cunnings & Sturt, 2018), which can arise in sentences like The letter next to the porcelain plate shattered. Capturing such retrieval interference effects requires lexically specific features and retrieval cues, but hand-picking the features is hard to do in a principled way and greatly increases modeler degrees of freedom. To remedy this, we use word embeddings, a well-established method for creating distributed feature representations, for lexical features and retrieval cues. We show that the similarity between the features and the cues (a measure of plausibility) predicts total reading times in Cunnings and Sturt's eye-tracking data. The features can easily be plugged into existing parsing models (including cue-based retrieval and self-organized parsing), putting very different models on more equal footing and facilitating future quantitative comparisons. In addition to this methodological contribution, our results suggest that, contrary to Cunnings and Sturt's original conclusions, focused words might be more prominent in memory, making them less susceptible to interference, as predicted by a recent extension to ACT-R (Engelmann, Jäger, & Vasishth, 2019).
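The core of the method is simple to sketch: plausibility is read off as the cosine between a retrieval-cue vector and a candidate word's feature vector. A minimal sketch, with hypothetical toy vectors standing in for real pretrained embeddings (the names and values below are illustrative, not the authors' materials):

```python
import numpy as np

def cosine_similarity(cue: np.ndarray, feature: np.ndarray) -> float:
    """Cosine similarity between a retrieval-cue vector and a lexical feature vector."""
    return float(np.dot(cue, feature) / (np.linalg.norm(cue) * np.linalg.norm(feature)))

# Toy 4-dimensional vectors (illustrative values only, not trained embeddings)
verb_cue = np.array([0.8, 0.1, 0.3, 0.5])  # cue projected by "shattered"
plate    = np.array([0.7, 0.2, 0.4, 0.4])  # a plausible shatterer
letter   = np.array([0.1, 0.9, 0.2, 0.1])  # an implausible shatterer

# Higher cue-feature similarity stands in for higher plausibility of the retrieval
print(cosine_similarity(verb_cue, plate) > cosine_similarity(verb_cue, letter))  # True
```

On this view, a distractor whose features are more similar to the cue than the target's are is what produces an illusion of plausibility, with no hand-picked binary features required.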


2021 ◽  
Author(s):  
Dario Paape ◽  
Shravan Vasishth ◽  
Ralf Engbert

Local coherence effects arise when the human sentence processor is temporarily misled by a locally grammatical but globally ungrammatical analysis ("The coach smiled at THE PLAYER TOSSED A FRISBEE by the opposing team"). It has been suggested that such effects occur either because sentence processing occurs in a bottom-up, self-organized manner rather than being under constant grammatical supervision (Tabor, Galantucci, & Richardson, 2004), or because local coherence can disrupt processing due to readers maintaining uncertainty about previous input (Levy, 2008). We report the results of an eye-tracking study in which subjects read German grammatical and ungrammatical sentences that either contained a locally coherent substring or not and gave binary grammaticality judgments. In our data, local coherence affected on-line processing immediately at the point of the manipulation. There was, however, no indication that local coherence led to illusions of grammaticality (a prediction of self-organization), and only weak, inconclusive support for local coherence leading to targeted regressions to critical context words (a prediction of the uncertain-input approach). We discuss implications for self-organized and noisy-channel models of local coherence.


2021 ◽  
Vol 124 ◽  
pp. 101356
Author(s):  
Garrett Smith ◽  
Julie Franck ◽  
Whitney Tabor

Open Mind ◽  
2021 ◽  
pp. 1-17
Author(s):  
Dario Paape ◽  
Shravan Vasishth ◽  
Ralf Engbert



2019 ◽  
Vol 42 ◽  
Author(s):  
Lucio Tonello ◽  
Luca Giacobbi ◽  
Alberto Pettenon ◽  
Alessandro Scuotto ◽  
Massimo Cocchi ◽  
...  

Individuals with autism spectrum disorder (ASD) can present temporary behaviors of acute agitation and aggressiveness, termed problem behaviors. These have been shown to be consistent with self-organized criticality (SOC), a model wherein occasionally occurring "catastrophic events" are necessary in order to maintain a self-organized "critical equilibrium." SOC can represent psychopathology network structures and additionally suggests that they can be considered self-organized systems.
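The classic illustration of self-organized criticality is the Bak–Tang–Wiesenfeld sandpile: grains are added slowly, overloaded sites topple onto their neighbors, and avalanches of all sizes emerge without any external tuning. A minimal sketch, purely illustrative and not a model from the paper (grid size, grain count, and threshold are arbitrary choices):

```python
import numpy as np

def topple(grid: np.ndarray, threshold: int = 4) -> int:
    """Relax the sandpile: topple every site at or above threshold until stable.
    Returns the avalanche size (total number of topplings)."""
    size = 0
    while True:
        unstable = np.argwhere(grid >= threshold)
        if len(unstable) == 0:
            return size
        for i, j in unstable:
            grid[i, j] -= 4
            size += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < grid.shape[0] and 0 <= nj < grid.shape[1]:
                    grid[ni, nj] += 1  # grains toppled past the boundary are lost

rng = np.random.default_rng(0)
grid = np.zeros((20, 20), dtype=int)
avalanches = []
for _ in range(5000):
    i, j = rng.integers(0, 20, size=2)
    grid[i, j] += 1              # slow driving: drop one grain at a random site
    avalanches.append(topple(grid))
# Most drops cause no avalanche; occasional large "catastrophic" avalanches
# keep the pile near its critical state.
```

The analogy in the abstract is that problem behaviors play the role of the rare large avalanches: disruptive events that nonetheless maintain the system's critical equilibrium.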


Author(s):  
Margreet Vogelzang ◽  
Christiane M. Thiel ◽  
Stephanie Rosemann ◽  
Jochem W. Rieger ◽  
Esther Ruigendijk

Purpose: Adults with mild-to-moderate age-related hearing loss typically exhibit issues with speech understanding, but their processing of syntactically complex sentences is not well understood. We test the hypothesis that the difficulties listeners with hearing loss have in comprehending and processing syntactically complex sentences are due to the processing of degraded input interfering with the successful processing of complex sentences.
Method: We performed a neuroimaging study with a sentence comprehension task, varying sentence complexity (through subject–object order and verb–arguments order) and cognitive demands (presence or absence of a secondary task) within subjects. Groups of older subjects with hearing loss (n = 20) and age-matched normal-hearing controls (n = 20) were tested.
Results: The comprehension data show effects of syntactic complexity and hearing ability, with normal-hearing controls outperforming listeners with hearing loss, seemingly more so on syntactically complex sentences. The secondary task did not influence off-line comprehension. The imaging data show effects of group, sentence complexity, and task, with listeners with hearing loss showing decreased activation in typical speech processing areas, such as the inferior frontal gyrus and superior temporal gyrus. No interactions between group, sentence complexity, and task were found in the neuroimaging data.
Conclusions: The results suggest that listeners with hearing loss process speech differently from their normal-hearing peers, possibly due to the increased demands of processing degraded auditory input. Increased cognitive demands by means of a secondary visual shape processing task influence neural sentence processing, but no evidence was found that they do so differently for listeners with hearing loss and normal-hearing listeners.

