Speech Perception
Speech perception can be thought of as the set of operations that take as input the continuously varying acoustic waveform available at the auditory periphery (vibrations in the ear) and generate as output the representations (abstractions in the head) that form the basis for the subsequent operations mediating language comprehension (which can, of course, be fed by audition, vision, or touch). The neural basis of speech perception proper has been studied experimentally with every available neural recording and stimulation technique. Interpreting the findings and developing a comprehensive mechanistic theory are complicated by the fact that very different protocols are used: studies range from the identification and categorization of single vowels and syllables, to decisions on single spoken words, to intelligibility judgments on connected speech.

Within this broader context, three topics have received considerable attention: the hemispheric lateralization of speech perception, the role of the motor system, and the potential contribution of neural oscillations to perceptual analysis. This chapter discusses each of these areas in turn.