Rational and Flexible Adaptation of Sentence Production to Ongoing Language Experience

2021, Vol 12
Author(s): Malathi Thothathiri

Whether sentences are formulated primarily using lexically based or non-lexically based information has been much debated. In this perspective article, I review evidence for rational flexibility in the sentence production architecture. Sentences can be constructed flexibly via lexically dependent or independent routes, and rationally depending on the statistical properties of the input and the validity of lexical vs. abstract cues for predicting sentence structure. Different neural pathways appear to be recruited for individuals with different executive function abilities and for verbs with different statistical properties, suggesting that alternative routes are available for producing the same structure. Together, extant evidence indicates that the human brain adapts to ongoing language experience during adulthood, and that the nature of the adjustment may depend rationally on the statistical contingencies of the current context.

2020, Vol 6 (30), pp. eaba7830
Author(s): Laurianne Cabrera, Judit Gervain

Speech perception is constrained by auditory processing. Although at birth infants have an immature auditory system and limited language experience, they show remarkable speech perception skills. To assess neonates’ ability to process the complex acoustic cues of speech, we combined near-infrared spectroscopy (NIRS) and electroencephalography (EEG) to measure brain responses to syllables differing in consonants. The syllables were presented in three conditions preserving (i) original temporal modulations of speech [both amplitude modulation (AM) and frequency modulation (FM)], (ii) both fast and slow AM, but not FM, or (iii) only the slowest AM (<8 Hz). EEG responses indicate that neonates can encode consonants in all conditions, even without the fast temporal modulations, similarly to adults. Yet, the fast and slow AM activate different neural areas, as shown by NIRS. Thus, the immature human brain is already able to decompose the acoustic components of speech, laying the foundations of language learning.
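The slow-AM condition described above amounts to extracting a signal's amplitude envelope and low-pass filtering it below 8 Hz. The sketch below illustrates that general technique with a Hilbert-transform envelope and a Butterworth filter; it is an assumption-laden illustration of the idea, not the authors' actual stimulus-processing pipeline.

```python
# Minimal sketch of isolating slow amplitude modulations (<8 Hz) of a signal,
# in the spirit of the study's degraded-speech conditions. Illustrative only;
# the function name and parameters are hypothetical.
import numpy as np
from scipy.signal import hilbert, butter, sosfiltfilt

def slow_am_envelope(signal, fs, cutoff_hz=8.0):
    """Extract the amplitude envelope via the Hilbert transform, then
    low-pass filter it so only modulations below cutoff_hz remain."""
    envelope = np.abs(hilbert(signal))  # instantaneous amplitude (AM)
    sos = butter(4, cutoff_hz, btype="low", fs=fs, output="sos")
    return sosfiltfilt(sos, envelope)   # zero-phase low-pass filtering

# Toy example: 1 s of a 100 Hz tone, amplitude-modulated at 4 Hz
fs = 16000
t = np.arange(fs) / fs
x = (1 + 0.5 * np.sin(2 * np.pi * 4 * t)) * np.sin(2 * np.pi * 100 * t)
env = slow_am_envelope(x, fs)  # 4 Hz modulation survives the 8 Hz cutoff
```

Because the modulation rate (4 Hz) is below the cutoff, the filtered envelope still carries the slow AM; a faster modulation (say, 50 Hz) would be removed, analogous to the study's condition (iii).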


2019, Vol 24 (4), pp. 729-739
Author(s): Yusuke Moriguchi, Kanda Lertladaluck

Aims and objectives: Bilingual children constantly experience spontaneous switching between languages in everyday settings, and some researchers suggest that this experience leads to an advantage in performance on executive function tasks. Neural processing during executive function tasks remains largely unknown, especially in young bilingual children. Methodology: Using functional near-infrared spectroscopy, this study examined whether young children who attended an immersion second-language program demonstrated enhanced cognitive shifting and lateral prefrontal activation. Data and analysis: We recruited children (N = 24) who attended an international nursery school, and examined whether their performance on a cognitive shifting task, and the oxygenated hemoglobin changes in the prefrontal regions during the task, were correlated with the children's second-language verbal age and the length of time the children had been speaking the second language. Findings: Results revealed that second-language verbal age and the length of time speaking the second language were significantly correlated with behavioral performance on cognitive shifting tasks, but not with activation in the lateral prefrontal regions. Originality: We examined the neural correlates of bilingual effects on cognitive shifting and prefrontal activation in young children. Implications: The results suggest that second-language experience may not be directly related to neural processing in the lateral prefrontal cortex, at least in young children.


Neuroreport, 1998, Vol 9 (9), pp. 2115-2119
Author(s): Jack Gandour, Donald Wong, Gary Hutchins

2015, Vol 58 (4), pp. 1182-1194
Author(s): Jiyeon Lee, Masaya Yoshida, Cynthia K. Thompson

Purpose Grammatical encoding (GE) is impaired in agrammatic aphasia; however, the nature of such deficits remains unclear. We examined grammatical planning units during real-time sentence production in speakers with agrammatic aphasia and control speakers, testing two competing models of GE. We queried whether speakers with agrammatic aphasia produce sentences word by word without advanced planning or whether hierarchical syntactic structure (i.e., verb argument structure; VAS) is encoded as part of the advanced planning unit. Method Experiment 1 examined production of sentences with a predefined structure (i.e., "The A and the B are above the C") using eye tracking. Experiment 2 tested production of transitive and unaccusative sentences without a predefined sentence structure in a verb-priming study. Results In Experiment 1, both speakers with agrammatic aphasia and young and age-matched control speakers used word-by-word strategies, selecting the first lemma (noun A) only prior to speech onset. However, in Experiment 2, unlike controls, speakers with agrammatic aphasia preplanned transitive and unaccusative sentences, encoding VAS before speech onset. Conclusions Speakers with agrammatic aphasia show incremental, word-by-word production for structurally simple sentences, requiring retrieval of multiple noun lemmas. However, when sentences involve functional (thematic to grammatical) structure building, advanced planning strategies (i.e., VAS encoding) are used. This early use of hierarchical syntactic information may provide a scaffold for impaired GE in agrammatism.


Author(s): Richard K. Peach

Purpose Analyses of language production of individuals with traumatic brain injury (TBI) place increasing emphasis on microlinguistic (i.e., within-sentence) patterns. It is unknown whether the observed problems involve implementation of well-formed sentence frames or represent a fundamental linguistic disturbance in computing sentence structure. This study investigated the cognitive basis for microlinguistic deficits in individuals with TBI. Method Fifteen nonaphasic individuals with severe TBI and 6 age- and education-matched non-brain-injured adults participated in this study. Monologic discourse samples were analyzed for pausing patterns, mazes, errors, and abandoned utterances. Measures of cognitive abilities were correlated with the sentence measures. Results The speakers with TBI produced more pauses between clauses (but not within clauses) as well as more mazes than did the non-brain-injured speakers. Significant regression models were built: Raven's Coloured Progressive Matrices (Raven, 1965), a measure associated with working memory, predicted pause behavior, and Likenesses-Differences (Baker & Leland, 1967), a measure of executive function, predicted maze behavior. Conclusions Sentence planning impairments following TBI are associated with deficient organization and monitoring of language representations in working memory. These findings suggest that the deficits are due to problems in the recruitment and control of attention for sentence planning. These findings bear on sentence processing models that emphasize the activation, organization, and maintenance of language representations for accurate sentence production.


Author(s): Trevor Brothers, Liv J Hoversten, Matthew J Traxler

Abstract Syntactic parsing plays a central role in the interpretation of sentences, but it is unclear to what extent non-native speakers can deploy native-like grammatical knowledge during online comprehension. The current eye-tracking study investigated how Chinese–English bilinguals and native English speakers respond to syntactic category and subcategorization information while reading sentences with object-subject ambiguities. We also obtained measures of English language experience, working memory capacity, and executive function to determine how these cognitive variables influence online parsing. During reading, monolinguals and bilinguals showed similar garden-path effects related to syntactic reanalysis, but native English speakers responded more robustly to verb subcategorization cues. Readers with greater language experience and executive function showed increased sensitivity to verb subcategorization cues, but parsing was not influenced by working memory capacity. These results are consistent with exposure-based accounts of bilingual sentence processing, and they support a link between syntactic processing and domain-general cognitive control.


2021
Author(s): Julia Catherine Hoyda, Hannah J Stewart, Jennifer Vannest, David R Moore

Listening difficulties (LiD) are characterized by reported problems with listening despite normal hearing thresholds. LiD often overlap with other developmental disorders, including speech and language disorders, and involve similar higher-order auditory processing. This study used resting-state functional MRI to examine functional brain networks associated with receptive and expressive speech and language and executive function in children with LiD and typically developing (TD) peers (average age of 10 years). We examined differences in region-of-interest (ROI)-to-ROI functional connectivity between (1) the LiD group and the TD group and (2) within the LiD group, those participants who had seen a speech-language pathologist (SLP) and those who had not. The latter comparison served as a way of comparing children with and without speech and language disorders. Connections that differed between groups were analyzed for correlations with behavioral test data. The results showed functional connectivity differences between the LiD and TD groups in the executive function network and trends in the speech perception network. Differences were also found in the executive network between those LiD participants who had seen an SLP and those who had not. Several of these connectivity differences, particularly frontal-striatal connections, correlated with performance on behavioral tests, including tests that measure attention, executive function, and episodic memory, as well as speech, vocabulary, and sentence structure. The results of this study suggest that differences in functional connectivity in brain networks associated with speech perception and executive function may underlie and contribute to listening difficulties.
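ROI-to-ROI functional connectivity of the kind analyzed above is conventionally computed as pairwise Pearson correlations between mean ROI time series, Fisher z-transformed for group statistics. The sketch below shows that standard computation on toy data; it is a minimal illustration under those assumptions, not the authors' actual analysis pipeline.

```python
# Minimal sketch of an ROI-to-ROI functional connectivity matrix, as commonly
# computed in resting-state fMRI analyses. The function name is hypothetical.
import numpy as np

def roi_connectivity(timeseries):
    """timeseries: (n_timepoints, n_rois) array of mean BOLD signals per ROI.
    Returns an (n_rois, n_rois) matrix of Fisher z-transformed Pearson
    correlations, with the diagonal (self-correlation) set to 0."""
    r = np.corrcoef(timeseries.T)  # rows as variables -> ROI-by-ROI correlations
    np.fill_diagonal(r, 0.0)       # drop self-correlations before the transform
    return np.arctanh(r)           # Fisher z, suitable for group comparisons

# Toy example: 200 timepoints from 4 ROIs of random noise
rng = np.random.default_rng(0)
z = roi_connectivity(rng.standard_normal((200, 4)))
print(z.shape)  # (4, 4)
```

Group differences (e.g., LiD vs. TD) would then typically be tested edge by edge on the z values, with correction for multiple comparisons across the ROI pairs.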

