Abstract

The neural basis of language has been studied for centuries, yet the networks critically involved in simply identifying or understanding a spoken word remain elusive. Several functional-anatomical models of the critical neural substrates of receptive speech processes have been proposed, implicating auditory-related regions in the (1) left mid-posterior superior temporal lobe, (2) left anterior superior temporal lobe, or (3) bilateral mid-posterior superior temporal areas, as well as (4) motor-related regions in the left frontal lobe (under normal and/or noisy conditions). One difficulty in comparing these models is that they often focus on different aspects of the sound-to-meaning pathway and use different types of stimuli and tasks. Two auditory tasks that are typically used in separate studies—nonword discrimination and word comprehension—often yield different conclusions. We assessed word/nonword discrimination and clear/noisy word comprehension in 160 individuals with focal brain damage: left (n=115) or right (n=19) hemisphere stroke, left (n=18) or right (n=8) anterior temporal lobectomy, and 26 neurologically intact controls. Discrimination and comprehension doubly dissociated both behaviorally and neurologically. In support of a bilateral model, clear speech comprehension was near ceiling in 92% of left stroke cases, and right temporal damage was, for the first time, found to impair phoneme discrimination. Lesion-symptom mapping analyses for the word discrimination, nonword discrimination, and noisy word comprehension tasks each implicated most of the left superior temporal gyrus (STG) except the anteriormost regions. Comprehension tasks additionally implicated the left posterior middle temporal gyrus (pMTG), while discrimination tasks additionally implicated more dorsal sensorimotor regions in posterior perisylvian cortex.