Effects of Tone Language Experience on Second Language Tone Acquisition

2017 · Vol 05 (03) · pp. 234-240
Author(s): 宇 康

2019 · Vol 42 (1) · pp. 33-59
Author(s): Ricky KW Chan, Janny HC Leung

Abstract: L2 sounds present different kinds of challenges to learners at the phonetic, phonological, and lexical levels, but previous studies on L2 tone learning have mostly focused on the phonetic and lexical levels. The present study employs an innovative technique to examine the role of prior tonal experience and musical training in forming novel abstract syllable-level tone categories. Eighty Cantonese and English musicians and nonmusicians completed two tasks: (a) AX tone discrimination and (b) incidental learning of artificial tone-segment connections (e.g., words beginning with an aspirated stop always carry a rising tone) with synthesized stimuli modeled on Thai. Although the four participant groups distinguished the target tones similarly well, Cantonese speakers showed abstract and implicit knowledge of the target tone-segment mappings after training but English speakers did not, regardless of their musical experience. This suggests that tone language experience, but not musical experience, is crucial for forming novel abstract syllable-level tone categories.
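The hidden tone-segment dependency used in the incidental learning task (an aspirated onset always paired with a rising tone) can be made concrete with a small sketch. The following Python fragment is only an illustration under assumed inventories: the onsets, rimes, and tone labels are hypothetical placeholders, not the study's Thai-modeled stimuli.

    import random

    # Hypothetical inventories; the real stimuli were synthesized syllables
    # modeled on Thai, not these placeholder strings.
    ASPIRATED_ONSETS = ["ph", "th", "kh"]
    UNASPIRATED_ONSETS = ["p", "t", "k"]
    RIMES = ["a", "i", "u", "aa", "ii", "uu"]

    def tone_for(onset: str) -> str:
        """The hidden rule: aspirated onsets always carry the rising tone."""
        return "rising" if onset in ASPIRATED_ONSETS else "falling"

    def make_training_items(n: int, seed: int = 0) -> list:
        """Generate n (syllable, tone) training items that all obey the rule."""
        rng = random.Random(seed)
        items = []
        for _ in range(n):
            onset = rng.choice(ASPIRATED_ONSETS + UNASPIRATED_ONSETS)
            items.append((onset + rng.choice(RIMES), tone_for(onset)))
        return items

    for syllable, tone in make_training_items(8):
        print(syllable, tone)

Because learners are never told the rule, any above-chance sensitivity to violations of this mapping after exposure is taken as evidence of implicit, abstract tone-segment knowledge.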


2018 · Vol 38 · pp. 60-79
Author(s): Judith F. Kroll, Paola E. Dussias, María Teresa Bajo

Abstract: Bilingualism is a complex life experience. Second language (L2) learning and bilingualism take place in many different contexts. To develop a comprehensive account of dual-language experience requires research that examines individuals who are learning and using two languages in both the first language (L1) and second language (L2) environments. In this article, we review studies that exploit the presence of an international research network on bilingualism to investigate the role of the environment and some of the unique characteristics of L2 learning and bilingual language usage in different locations. We ask how the context of learning affects the acquisition of the L2 and the ability to control the use of each language, how language processing is changed by the patterns of language usage in different places (e.g., whether bilinguals have been immersed in the L2 environment for an extended period of time or whether they code-switch), and how the bilingualism of the community itself influences learning and language use.


2019 · Vol 24 (4) · pp. 729-739
Author(s): Yusuke Moriguchi, Kanda Lertladaluck

Aims and objectives: Bilingual children constantly experience spontaneous switching between languages in everyday settings, and some researchers suggest that this experience leads to an advantage in performance on executive function tasks. However, the neural processing that underlies executive function tasks remains largely unknown, especially in young bilingual children. Methodology: Using functional near-infrared spectroscopy, this study examined whether young children who attended an immersion second-language program demonstrated enhanced cognitive shifting and lateral prefrontal activation. Data and analysis: We recruited children (N = 24) who attended an international nursery school and examined whether their performance on a cognitive shifting task, and the oxygenated hemoglobin changes in the prefrontal regions during the task, were correlated with the children's second-language verbal age and the length of time the children had been speaking the second language. Findings: The verbal age of the second language and the length of time speaking it were significantly correlated with behavioral performance on the cognitive shifting task. However, they were not correlated with activation in the lateral prefrontal regions. Originality: We examined the neural correlates of bilingual effects on cognitive shifting and prefrontal activation in young children. Implications: The results suggest that second-language experience may not be directly related to neural processing in the lateral prefrontal cortex, at least in young children.
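As a rough illustration of the correlational analysis described above, the sketch below relates hypothetical cognitive-shifting scores and prefrontal oxy-Hb changes to L2 verbal age with a standard Pearson correlation. All numbers are invented placeholders, and the snippet is not the study's actual analysis pipeline.

    import numpy as np
    from scipy.stats import pearsonr

    # Placeholder values for a small sample of children (not the study's data).
    l2_verbal_age_months = np.array([30, 36, 42, 48, 54, 60, 66, 72])
    shifting_accuracy    = np.array([0.55, 0.60, 0.62, 0.70, 0.68, 0.75, 0.80, 0.78])
    oxy_hb_change        = np.array([0.02, 0.05, 0.01, 0.04, 0.03, 0.02, 0.06, 0.03])

    # Behavioral correlation: L2 verbal age vs. cognitive-shifting accuracy.
    r_beh, p_beh = pearsonr(l2_verbal_age_months, shifting_accuracy)
    # Neural correlation: L2 verbal age vs. task-related prefrontal oxy-Hb change.
    r_nirs, p_nirs = pearsonr(l2_verbal_age_months, oxy_hb_change)

    print(f"verbal age vs. shifting accuracy: r = {r_beh:.2f}, p = {p_beh:.3f}")
    print(f"verbal age vs. prefrontal oxy-Hb: r = {r_nirs:.2f}, p = {p_nirs:.3f}")

A pattern like the one reported would show a reliable positive correlation for the behavioral measure but no reliable correlation for the oxy-Hb measure.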


2016 · Vol 20 (3) · pp. 642-648
Author(s): Jennifer Menjivar, Nameera Akhtar

Four-year-old English speakers (N = 48) who were monolingual, bilingual, or regularly exposed to a second language were taught what they were told were foreign labels for familiar and novel objects. When task demands were low, there was no difference in word learning among the three groups. However, when task demands were higher, bilinguals learned more words than monolingual children, and exposed children's performance fell between the two. These findings indicate that the bilingual word learning advantage seen in adults may begin as early as the preschool years.


1992 · Vol 14 (2) · pp. 131-158
Author(s): Ocke-Schwen Bohn, James Emil Flege

The study reported in this paper examined the effect of second language (L2) experience on the production of L2 vowels for which acoustic counterparts are either present or absent in the first language (L1). The hypothesis being tested was that amount of L2 experience would not affect L1 German speakers' production of the "similar" English vowels /i, ɪ, ɛ/, whereas English language experience would enable L1 Germans to produce an English-like /æ/, which has no counterpart in German. The predictions were tested in two experiments that compared the production of English /i, ɪ, ɛ, æ/ by two groups of L1 German speakers differing in English language experience and an L1 English control group. An acoustic experiment compared the three groups for spectral and temporal characteristics of the English vowels produced in /bVt/ words. The same tokens were assessed for intelligibility in a labeling experiment. The results of both experiments were largely consistent with the hypothesis. The experienced L2 speakers did not produce the similar English vowels /i, ɪ, ɛ/ more intelligibly than the inexperienced L2 speakers, nor did experience have a positive effect on approximating the English acoustic norms for these similar vowels. The intelligibility results for the new vowel /æ/ did not clearly support the model. However, the acoustic comparisons showed that the experienced but not the inexperienced L2 speakers produced the new vowel /æ/ in much the same way as the native English speakers.
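For readers less familiar with this kind of acoustic comparison, the sketch below computes group means for the first two formants (F1, F2) and duration of a vowel, plus each group's spectral distance from a native-speaker norm. The formant and duration values are invented purely for illustration and do not reproduce the study's measurements.

    import numpy as np

    # Hypothetical per-speaker measurements of /ae/ in /bVt/ words:
    # F1 and F2 in Hz, duration in ms (placeholder numbers only).
    groups = {
        "native English":          {"F1": [690, 705, 680], "F2": [1780, 1760, 1800], "dur": [180, 175, 190]},
        "experienced L1 German":   {"F1": [675, 700, 690], "F2": [1750, 1790, 1770], "dur": [170, 185, 178]},
        "inexperienced L1 German": {"F1": [560, 580, 570], "F2": [1900, 1880, 1910], "dur": [150, 160, 155]},
    }

    # Use the native-English group means as the acoustic norm.
    norm = {k: np.mean(v) for k, v in groups["native English"].items()}

    for name, vals in groups.items():
        f1, f2, dur = (np.mean(vals[k]) for k in ("F1", "F2", "dur"))
        # Euclidean distance from the native norm in (F1, F2) space.
        spectral_dist = np.hypot(f1 - norm["F1"], f2 - norm["F2"])
        print(f"{name:26s} F1={f1:.0f} Hz  F2={f2:.0f} Hz  dur={dur:.0f} ms  "
              f"distance from norm = {spectral_dist:.0f} Hz")

A finding like the one reported would correspond to a small distance from the native norm for the experienced group and a large one for the inexperienced group on the new vowel /æ/.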


2011 · Vol 33 (3) · pp. 623-641
Author(s): Xianghua Wu, Jung-Yueh Tu, Yue Wang

Abstract: The theoretical framework of this study is based on the prevalent debate over whether prosodic processing is influenced by higher-level, language-specific circuits or reflects lower-level encoding of physical properties. Using the dichotic listening technique, the study investigates the hemispheric processing of Japanese pitch accent by native Japanese listeners and two groups of nonnative listeners with no prior pitch accent experience but differing in their native language experience with linguistic pitch: native listeners of Mandarin (a tone language with higher linguistic functional use of pitch) and native listeners of English (a stress language with lower functional use of pitch). The overall results reveal that, for both native and nonnative listeners, the processing of Japanese pitch accent is less lateralized (compared to lexical tone processing, which has been found to be a left hemisphere property). However, detailed analysis of individual pitch accents across groups shows a right hemisphere preference for processing the high-accent-low (H*L) pattern, a left hemisphere preference for LH*, and no hemisphere dominance for LH, indicating a significant reliance on the acoustic cues. These patterns are particularly prominent with the English listeners, who are least experienced with linguistic pitch. Together, the findings suggest an interplay of linguistic and acoustic aspects in the processing of Japanese pitch accent by native and nonnative listeners.
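Hemispheric preference in dichotic listening is commonly summarized with a laterality index computed over correct reports from each ear. The sketch below applies that conventional (R - L) / (R + L) formula to made-up counts; it illustrates a standard convention, not necessarily the exact measure used in this study.

    # Laterality index for dichotic listening: positive values indicate a
    # right-ear (left-hemisphere) advantage, negative values a left-ear
    # (right-hemisphere) advantage.
    def laterality_index(right_correct: int, left_correct: int) -> float:
        total = right_correct + left_correct
        return (right_correct - left_correct) / total if total else 0.0

    # Hypothetical per-pattern counts for one listener (not the study's data).
    conditions = {
        "H*L": (38, 52),   # more left-ear hits: right-hemisphere preference
        "LH*": (55, 40),   # more right-ear hits: left-hemisphere preference
        "LH":  (47, 46),   # roughly balanced: no clear dominance
    }

    for pattern, (right, left) in conditions.items():
        print(f"{pattern}: LI = {laterality_index(right, left):+.2f}")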


2012 · Vol 15 (2) · pp. 402-412
Author(s): Diane Brentari, Marie A. Nadolske, George Wolford

In this paper, the prosodic structure of American Sign Language (ASL) narratives is analyzed in deaf native signers (L1-D), hearing native signers (L1-H), and highly proficient hearing second language signers (L2-H). The results of this study show that the prosodic patterns used by these groups are associated both with their ASL language experience (L1 or L2) and with their hearing status (deaf or hearing), suggesting that experience using co-speech gesture (i.e., gesturing while speaking) may have some effect on the prosodic cues used by hearing signers, similar to the effects of the prosodic structure of an L1 on an L2.

