On the Studies of Syllable Segmentation and Improving MFCCs for Automatic Birdsong Recognition

Author(s): Chih-Hsun Chou, Pang-Hsin Liu, Bingjing Cai
1988 · Vol 53 (3) · pp. 316-327
Author(s): Alan G. Kamhi, Hugh W. Catts, Daria Mauer, Kenn Apel, Betholyn F. Gentry

In the present study, we further examined the phonological processing abilities of language-impaired (LI) and reading-impaired (RI) children (see Kamhi & Catts, 1986). We also evaluated these children's ability to process spatial information. Subjects were 10 LI, 10 RI, and 10 normal children between the ages of 6:8 and 8:10 years. Each subject was administered eight tasks: four word repetition tasks (monosyllabic, monosyllabic presented in noise, three-item, and multisyllabic), rapid naming, syllable segmentation, paper folding, and form completion. The normal children performed significantly better than both the LI and RI children on all but two tasks: syllable segmentation and repeating words presented in noise. The LI and RI children performed comparably on every task except the multisyllabic word repetition task. These findings were consistent with those from our previous study (Kamhi & Catts, 1986). The similarities and differences between LI and RI children are discussed.


2021 · Vol 3 (5) · pp. 397-406
Author(s): Carlos Arce-Lopera, María José Arias, Gustavo Corrales

Complexity · 2019 · Vol 2019 · pp. 1-21
Author(s): Karl D. Neergaard, Chu-Ren Huang

The purpose of this study was to construct, measure, and identify a schematic representation of phonological processing in the tonal language Mandarin Chinese through the combination of network science and psycholinguistic tasks. Two phonological association tasks were performed with native Mandarin speakers to identify an optimal phonological annotation system. The first task served to compare two existing syllable inventories and to construct a novel system where either performed poorly. The second task validated the novel syllable inventory. In both tasks, participants manipulated lexical items at each possible syllable location, but preferred to maintain whole syllables while manipulating lexical tone in their search through the mental lexicon. The optimal syllable inventory was then used as the basis of a Mandarin phonological network. Phonological edit distance was used to construct sixteen versions of the same network, which we titled phonological segmentation neighborhoods (PSNs). The sixteen PSNs were representative of every proposal to date of syllable segmentation. Both syllable segmentation and whether lexical tone was treated as a unit affected the PSNs' topologies. Finally, reaction times from the second task were analyzed through a model selection procedure with the goal of identifying which of the sixteen PSNs best accounted for the mental target during the task. The identification of the tonal complex-vowel segmented PSN (C_V_C_T) was indicative of the stimuli characteristics and the choices participants made while searching through the mental lexicon. The analysis revealed that participants were inhibited by a greater clustering coefficient (interconnectedness of words according to phonological similarity) and facilitated by lexical frequency. This study illustrates how network science methods add to those of psycholinguistics to give insight into language processing that was not previously attainable.
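The construction described above can be sketched in code: a PSN links any two words whose phonological edit distance is 1, and the clustering coefficient of a word measures how interconnected its phonological neighbors are. The following is an illustrative sketch only, not the authors' code or stimuli — the toy lexicon, the exact segmentation tuples, and the function names are assumptions, though the C_V_C_T scheme (onset, vowel, coda, tone as separate units) follows the segmentation the abstract identifies as optimal.

```python
from itertools import combinations

# Hypothetical toy lexicon under a C_V_C_T segmentation:
# each word is (onset, vowel, coda, tone). These syllables are
# illustrative examples, not the study's stimuli.
LEXICON = {
    "ma1":  ("m", "a", "",  "1"),
    "ma2":  ("m", "a", "",  "2"),
    "ma3":  ("m", "a", "",  "3"),
    "ba1":  ("b", "a", "",  "1"),
    "man1": ("m", "a", "n", "1"),
}

def edit_distance(a, b):
    """Levenshtein distance over segment sequences (phonological edit distance)."""
    prev = list(range(len(b) + 1))
    for i, sa in enumerate(a, 1):
        cur = [i]
        for j, sb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (sa != sb)))   # substitution
        prev = cur
    return prev[-1]

def build_psn(lexicon):
    """Link any two words whose phonological edit distance is exactly 1."""
    adj = {w: set() for w in lexicon}
    for u, v in combinations(lexicon, 2):
        if edit_distance(lexicon[u], lexicon[v]) == 1:
            adj[u].add(v)
            adj[v].add(u)
    return adj

def clustering(adj, node):
    """Local clustering coefficient: fraction of a word's neighbor pairs
    that are themselves phonological neighbors of each other."""
    nbrs = adj[node]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for u, v in combinations(nbrs, 2) if v in adj[u])
    return 2 * links / (k * (k - 1))

psn = build_psn(LEXICON)
print(sorted(psn["ma1"]))      # ['ba1', 'ma2', 'ma3', 'man1']
print(clustering(psn, "ma2"))  # 1.0: its two neighbors, ma1 and ma3, are linked
```

Treating tone as its own segment (the `_T` slot) is what makes "ma1" and "ma2" neighbors at distance 1; dropping that slot, or fusing the vowel and coda, yields the other PSN variants the study compares.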


1998 · Vol 21 (4) · pp. 519-520
Author(s): Uwe Jürgens

The segmentation of phonation by articulation is a characteristic feature of speech that distinguishes it from most nonhuman vocalizations. However, apart from the trivial fact that speech uses some of the same muscles and, hence, the same motoneurons and motorcortical areas used in chewing, there is no convincing evidence that syllable segmentation relies on the same pattern generator as mastication. Evidence for a differential cortical representation of syllable segmentation ("frame") and syllable "content" is also meager.

