Neural Processing of Amplitude-Modulated Sounds

2004 ◽  
Vol 84 (2) ◽  
pp. 541-577 ◽  
Author(s):  
P. X. JORIS ◽  
C. E. SCHREINER ◽  
A. REES

Joris, P. X., C. E. Schreiner, and A. Rees. Neural Processing of Amplitude-Modulated Sounds. Physiol Rev 84: 541–577, 2004; 10.1152/physrev.00029.2003.—Amplitude modulation (AM) is a temporal feature of most natural acoustic signals. A long psychophysical tradition has shown that AM is important in a variety of perceptual tasks, over a range of time scales. Technical possibilities in stimulus synthesis have reinvigorated this field and brought the modulation dimension back into focus. We address the question of whether specialized neural mechanisms exist to extract AM information, and thus whether consideration of the modulation domain is essential in understanding the neural architecture of the auditory system. The available evidence suggests that this is the case. Peripheral neural structures not only transmit envelope information in the form of neural activity synchronized to the modulation waveform but are often tuned so that they only respond over a limited range of modulation frequencies. Ascending the auditory neuraxis, AM tuning persists but increasingly takes the form of tuning in average firing rate, rather than synchronization, to modulation frequency. There is a decrease in the highest modulation frequencies that influence the neural response, either in average rate or synchronization, as one records at higher and higher levels along the neuraxis. In parallel, there is an increasing tolerance of modulation tuning for other stimulus parameters such as sound pressure level, modulation depth, and type of carrier. At several anatomical levels, consideration of modulation response properties assists the prediction of neural responses to complex natural stimuli. Finally, some evidence exists for a topographic ordering of neurons according to modulation tuning. The picture that emerges is that temporal modulations are a critical stimulus attribute that assists us in the detection, discrimination, identification, parsing, and localization of acoustic sources and that this wide-ranging role is reflected in dedicated physiological properties at different anatomical levels.
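
As a minimal illustration of the stimulus class discussed above, the following sketch generates a sinusoidally amplitude-modulated (SAM) tone; the parameter values (4 kHz carrier, 100 Hz modulation, depth 0.8) are assumptions chosen for illustration and are not taken from the review.

import numpy as np

# Illustrative SAM tone: s(t) = [1 + m*sin(2*pi*fm*t)] * sin(2*pi*fc*t)
fs = 48000                          # sampling rate (Hz)
fc = 4000.0                         # carrier frequency (Hz)
fm = 100.0                          # modulation frequency (Hz)
m = 0.8                             # modulation depth (0..1)
t = np.arange(0, 0.5, 1.0 / fs)     # 500 ms of signal
envelope = 1.0 + m * np.sin(2 * np.pi * fm * t)
sam_tone = envelope * np.sin(2 * np.pi * fc * t)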

2007 ◽  
Vol 97 (1) ◽  
pp. 522-539 ◽  
Author(s):  
Paul C. Nelson ◽  
Laurel H. Carney

Neural responses to amplitude-modulated (AM) tones in the unanesthetized rabbit inferior colliculus (IC) were studied in an effort to establish explicit relationships between physiological and psychophysical measures of temporal envelope processing. Specifically, responses to variations in modulation depth (m) at the cell's best modulation frequency, with and without modulation maskers, were quantified in terms of average rate and synchronization to the envelope over the entire perceptual dynamic range of depths. Statistically significant variations in these metrics were used to define neural AM detection and discrimination thresholds. Synchrony emerged at modulation depths comparable with psychophysical AM detection sensitivities in some neurons, whereas the lowest rate-based neural thresholds could not account for psychoacoustical thresholds. The majority of rate thresholds (85%) were −10 dB or higher (in 20 log m), and 16% of the population exhibited no systematic dependence of average rate on m. Neural thresholds for AM detection did not decrease systematically at higher sound pressure levels (SPLs), as psychophysical thresholds do: thresholds remained constant or increased with level for most cells tested at multiple SPLs. At depths higher than the rate-based detection threshold, some rate modulation-depth functions were sufficiently steep with respect to the across-trial variability of the rate to predict depth discrimination thresholds as low as 1 dB (comparable with the psychophysics). Synchrony, on the other hand, did not vary systematically with m in many cells at high modulation depths. A simple computational model was extended to reproduce several features of the modulation frequency and depth dependence of both transient and sustained pure-tone responders.
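
For reference, the depth scale quoted above is 20 log10(m), so a −10 dB rate threshold corresponds to m ≈ 0.32 and 0 dB to a fully modulated stimulus (m = 1). A minimal conversion sketch (the helper names below are my own, not from the paper):

import numpy as np

def depth_to_db(m):
    # Linear modulation depth m (0..1) on the 20*log10(m) scale used above
    return 20 * np.log10(m)

def db_to_depth(db):
    # Inverse conversion: depth in dB (re: m = 1) back to linear depth
    return 10 ** (db / 20)

print(db_to_depth(-10))   # ~0.316, the -10 dB value quoted above
print(depth_to_db(1.0))   # 0.0 dB, full modulation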


2018 ◽  
Vol 30 (5) ◽  
pp. 1837-1848 ◽  
Author(s):  
Keely A. Muscatell ◽  
Ethan McCormick ◽  
Eva H. Telzer

Adolescence is a sensitive period for sociocultural development in which facets of social identity, including social status and race, become especially salient. Despite the heightened importance of both social status and race during this developmental period, no known work has examined how individual differences in social status influence perceptions of race in adolescents. Thus, in the present study, we investigated how both subjective social status and objective socioeconomic status (SES) influence neural responses to race. Twenty-three Mexican American adolescents (15 females; mean age = 17.22 years) were scanned using functional magnetic resonance imaging while they viewed Black and White faces in a standard labeling task. Adolescents rated their subjective social status in US society, while their parents responded to questions about their educational background, occupation, and economic strain (objective SES). Results demonstrated a negative association between subjective social status and neural responses in the amygdala, fusiform face area, and medial prefrontal cortex when adolescents viewed Black (relative to White) faces. In other words, adolescents with lower subjective social status showed greater activity in neural regions involved in processing salience, perceptual expertise, and thinking about the minds of others when they viewed images of Black faces, suggesting enhanced salience of race for these youth. There was no relationship between objective SES and neural responses to the faces. Moreover, instructing participants to focus on the gender or emotion expression on the face attenuated the relationship between subjective social status and neural processing of race. Together, these results demonstrate that subjective social status shapes the way the brain responds to race, which may have implications for psychopathology.


2020 ◽  
Author(s):  
Xiaohu Wu ◽  
Xiaoguang Liu ◽  
Mark Hickle ◽  
Dimitrios Peroulis ◽  
Juan Sebastian Gomez-Diaz ◽  
...  

In this paper, we demonstrate, for the first time, an isolating bandpass filter with low-loss forward transmission and high reverse isolation by modulating its constituent resonators. To understand the operating principle behind the device, we develop a spectral domain analysis method and show that the same-frequency nonreciprocity is a result of nonreciprocal frequency conversion to the intermodulation (IM) frequencies by the time-varying resonators. With appropriate modulation frequency, modulation depth, and phase delay, the signal power at the IM frequencies is converted back to the RF frequency and adds constructively to form a low-loss forward passband, whereas it adds destructively in the reverse direction to create the isolation. To validate the theory, a lumped-element three-pole 0.04-dB ripple isolating filter with a center frequency of 200 MHz and a ripple bandwidth of 30 MHz is designed, simulated, and measured. When modulated with a sinusoidal frequency of 30 MHz, a modulation index of 0.25, and an incremental phase difference of 45°, the filter achieves a forward insertion loss of 1.5 dB and a reverse isolation of 20 dB. The measured nonmodulated and modulated results agree very well with the simulations. Such nonreciprocal filters may find applications in wideband simultaneous transmit and receive radio front ends.
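
As a rough numerical companion to the frequency-conversion picture described above (a sketch of the sideband arithmetic only, using the example values quoted in the abstract, not the authors' spectral domain analysis): with a 200 MHz RF signal and 30 MHz modulation, the first-order IM products fall at 170 and 230 MHz, and the 45° incremental modulation phase applied to successive resonators determines whether the reconverted power adds constructively (forward) or destructively (reverse).

# Sideband arithmetic for the time-modulated resonators (illustrative only)
f_rf = 200e6          # RF center frequency (Hz)
f_mod = 30e6          # modulation frequency (Hz)
n_resonators = 3
phase_step_deg = 45   # incremental modulation phase between resonators

im_orders = [-2, -1, 1, 2]
im_freqs_mhz = [(f_rf + n * f_mod) / 1e6 for n in im_orders]
print(im_freqs_mhz)                                       # [140.0, 170.0, 230.0, 260.0]
print([k * phase_step_deg for k in range(n_resonators)])  # [0, 45, 90] degrees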


2021 ◽  
Author(s):  
Heather M Vance ◽  
Peter T Madsen ◽  
Natacha Aguilar de Soto ◽  
Danuta M Wisniewska ◽  
Michael Ladegaard ◽  
...  

Visual predators rely on fast-acting optokinetic reflexes to track and capture agile prey. Most toothed whales, however, rely on echolocation for hunting and have converged on biosonar clicking rates reaching 500/s during prey pursuits. If echoes are processed on a click-by-click basis, as assumed, neural responses 100x faster than those in vision are required to keep pace with this information flow. Using high-resolution bio-logging of wild predator-prey interactions, we show that toothed whales adjust clicking rates to track prey movement within 50-200 ms of prey escape responses. Hypothesising that these stereotyped biosonar adjustments are elicited by sudden prey accelerations, we measured echo-kinetic responses from trained harbour porpoises to a moving target and found similar latencies. High biosonar sampling rates are, therefore, not supported by extreme speeds of neural processing and muscular responses. Instead, the neuro-kinetic response times in echolocation are similar to those of tracking reflexes in vision, suggesting a common neural underpinning.


2008 ◽  
Vol 100 (3) ◽  
pp. 1602-1609 ◽  
Author(s):  
Bjarne Krebs ◽  
Nicholas A. Lesica ◽  
Benedikt Grothe

Temporal modulations in stimulus amplitude are essential for recognizing and categorizing behaviorally relevant acoustic signals such as speech. Despite this behavioral importance, it remains unclear how amplitude modulations (AMs) are represented in the responses of neurons at higher levels of the auditory system. Studies using stimuli with sinusoidal amplitude modulations (SAMs) have shown that the responses of many neurons are strongly tuned to modulation frequency, leading to the hypothesis that AMs are represented by their periodicity in the auditory midbrain. However, AMs in general are defined not only by their modulation frequency, but also by a number of other parameters (duration, duty cycle, etc.), which covary with modulation frequency in SAM stimuli. Thus the relationship between modulation frequency and neural responses as characterized with SAM stimuli alone is ambiguous. In this study, we characterize the representation of AMs in the gerbil inferior colliculus by analyzing neural responses to a series of pulse trains in which duration and interpulse interval are systematically varied to quantify the importance of duration, interpulse interval, duty cycle, and modulation frequency independently. We find that, although modulation frequency is indeed an important parameter for some neurons, the responses of many neurons are also strongly influenced by other AM parameters, typically duration and duty cycle. These results suggest that AMs are represented in the auditory midbrain not only by their periodicity, but by a complex combination of several important parameters.
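
To make the parameter covariation concrete (a sketch with assumed numbers, not the stimulus set used in the study): for a rectangular pulse train, modulation frequency and duty cycle follow from pulse duration and interpulse interval as fm = 1/(duration + interval) and duty cycle = duration/(duration + interval), so varying duration and interval independently decouples quantities that necessarily covary in SAM stimuli.

def pulse_train_params(duration_ms, interpulse_interval_ms):
    # Modulation frequency (Hz) and duty cycle implied by one pulse-train period
    period_ms = duration_ms + interpulse_interval_ms
    return 1000.0 / period_ms, duration_ms / period_ms

# Same modulation frequency (100 Hz) at two different duty cycles:
print(pulse_train_params(2.0, 8.0))   # (100.0, 0.2)
print(pulse_train_params(5.0, 5.0))   # (100.0, 0.5)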


Author(s):  
Stephanie M. Nanos

Previous research suggests that humans respond differently to reproductively relevant information in the environment, including heightened neural responses to sexual versus non-sexual cues. Limited research, however, has examined individual variation in the early neural processing of sexual information. Sexual self-schemas, or one's views of oneself as a sexual person, provide a stable cognitive framework for processing sexually relevant information and may relate to women's sexual responses. This study seeks to examine how women's sexual self-schemas relate to the early neural processing of sexual information and their subsequent subjective sexual arousal. Twenty women are being recruited from the Queen's psychology subject pool and data collection is currently underway. I am assessing women's neural responses to sexual and non-sexual images (i.e., erect penises versus elbows) using electroencephalography (EEG), and women are reporting their feelings of arousal to the sexual images. Women are also completing a measure of sexual self-schemas. I predict that women who have more positive sexual self-schema scores will have a stronger neural response to sexual stimuli than women with more negative schema scores. In addition, I predict that women with more positive schema scores will self-report higher sexual arousal than women with more negative scores. The findings of this study will improve our understanding of the role of sexual self-schemas and early neural processing in women's sexual response, thus contributing to the development of a comprehensive, empirically supported model of sexual response that accounts for within-gender variability.


2018 ◽  
Vol 8 (9) ◽  
pp. 174 ◽  
Author(s):  
Dimitrios Poulimeneas ◽  
Mary Yannakoulia ◽  
Costas Anastasiou ◽  
Nikolaos Scarmeas

Even though obese individuals often succeed with weight loss, long-term weight loss maintenance remains elusive. Dietary, lifestyle and psychosocial correlates of weight loss maintenance have been researched, yet the nature of maintenance is still poorly understood. Studying the neural processing of weight loss maintainers may provide much-needed insight into sustained obesity management. In this narrative review, we evaluate and critically discuss available evidence regarding the food-related neural responses of weight loss maintainers, as opposed to those of obese or lean persons. While research is still ongoing, available data indicate that following weight loss, maintainers exhibit persistent reward-related responses to food, similar to those of obese persons. However, unlike in obese persons, in maintainers, reward-related brain activity appears to be counteracted by subsequently heightened inhibition. These findings suggest that post-dieting, maintainers acquire a certain level of cognitive control that possibly protects them from regaining weight. The prefrontal cortex and the limbic system encompass key regions of interest for weight loss maintenance, and their contributions to long-term successful weight loss should be further explored. Future possibilities and supportive theories are discussed.


2014 ◽  
Vol 111 (11) ◽  
pp. 2244-2263 ◽  
Author(s):  
Brian J. Malone ◽  
Brian H. Scott ◽  
Malcolm N. Semple

Changes in amplitude and frequency jointly determine much of the communicative significance of complex acoustic signals, including human speech. We have previously described responses of neurons in the core auditory cortex of awake rhesus macaques to sinusoidal amplitude modulation (SAM) signals. Here we report a complementary study of sinusoidal frequency modulation (SFM) in the same neurons. Responses to SFM were analogous to SAM responses in that changes in multiple parameters defining SFM stimuli (e.g., modulation frequency, modulation depth, carrier frequency) were robustly encoded in the temporal dynamics of the spike trains. For example, changes in the carrier frequency produced highly reproducible changes in shapes of the modulation period histogram, consistent with the notion that the instantaneous probability of discharge mirrors the moment-by-moment spectrum at low modulation rates. The upper limit for phase locking was similar across SAM and SFM within neurons, suggesting shared biophysical constraints on temporal processing. Using spike train classification methods, we found that neural thresholds for modulation depth discrimination are typically far lower than would be predicted from frequency tuning to static tones. This “dynamic hyperacuity” suggests a substantial central enhancement of the neural representation of frequency changes relative to the auditory periphery. Spike timing information was superior to average rate information when discriminating among SFM signals, and even when discriminating among static tones varying in frequency. This finding held even when differences in total spike count across stimuli were normalized, indicating both the primacy and generality of temporal response dynamics in cortical auditory processing.
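
For comparison with the SAM sketch earlier in this listing, a sinusoidally frequency-modulated (SFM) tone can be written as s(t) = sin(2π fc t + (Δf/fm) sin(2π fm t)), where Δf is the peak frequency excursion; the parameter values below are assumptions for illustration and are not taken from the study.

import numpy as np

fs = 48000         # sampling rate (Hz)
fc = 4000.0        # carrier frequency (Hz)
fm = 10.0          # modulation frequency (Hz)
df = 400.0         # peak frequency deviation (Hz)
t = np.arange(0, 1.0, 1.0 / fs)
sfm_tone = np.sin(2 * np.pi * fc * t + (df / fm) * np.sin(2 * np.pi * fm * t))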


2011 ◽  
Vol 23 (5) ◽  
pp. 1205-1217 ◽  
Author(s):  
Roozbeh Behroozmand ◽  
Hanjun Liu ◽  
Charles R. Larson

The neural responses to sensory consequences of a self-produced motor act are suppressed compared with those in response to a similar but externally generated stimulus. Previous studies in the somatosensory and auditory systems have shown that the motor-induced suppression of the sensory mechanisms is sensitive to delays between the motor act and the onset of the stimulus. The present study investigated time-dependent neural processing of auditory feedback in response to self-produced vocalizations. ERPs were recorded in response to normal and pitch-shifted voice auditory feedback during active vocalization and passive listening to the playback of the same vocalizations. The pitch-shifted stimulus was delivered to the subjects' auditory feedback after a randomly chosen time delay between the vocal onset and the stimulus presentation. Results showed that the neural responses to delayed feedback perturbations were significantly larger than those in response to the pitch-shifted stimulus occurring at vocal onset. Active vocalization was shown to enhance neural responsiveness to feedback alterations only for nonzero delays compared with passive listening to the playback. These findings indicated that the neural mechanisms of auditory feedback processing are sensitive to timing between the vocal motor commands and the incoming auditory feedback. Time-dependent neural processing of auditory feedback may be an important feature of the audio-vocal integration system that helps to improve the feedback-based monitoring and control of voice structure through vocal error detection and correction.


2022 ◽  
Author(s):  
Ruosi Wang ◽  
Daniel Janini ◽  
Talia Konkle

Responses to visually presented objects along the cortical surface of the human brain have a large-scale organization reflecting the broad categorical divisions of animacy and object size. Mounting evidence indicates that this topographical organization is driven by differences between objects in mid-level perceptual features. With regard to the timing of neural responses, images of objects quickly evoke neural responses with decodable information about animacy and object size, but are mid-level features sufficient to evoke these rapid neural responses? Or is slower iterative neural processing required to untangle information about animacy and object size from mid-level features? To answer this question, we used electroencephalography (EEG) to measure human neural responses to images of objects and their texform counterparts: unrecognizable images that preserve some mid-level feature information about texture and coarse form. We found that texform images evoked neural responses with early decodable information about both animacy and real-world size, as early as responses evoked by original images. Further, successful cross-decoding indicates that both texform and original images evoke information about animacy and size through a common underlying neural basis. Broadly, these results indicate that the visual system contains a mid-level feature bank carrying linearly decodable information on animacy and size, which can be rapidly activated without requiring explicit recognition or protracted temporal processing.
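
The cross-decoding logic described above can be sketched as follows (a schematic with simulated data and assumed array shapes, not the authors' analysis pipeline): train a linear classifier on responses to texform images and test it on responses to the original images; above-chance generalization implies a common neural basis for the decoded information.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulated sensor patterns (trials x channels) sharing a common animacy signal
rng = np.random.default_rng(0)
n_trials, n_channels = 200, 64
labels = rng.integers(0, 2, n_trials)                 # 0 = inanimate, 1 = animate
shared = np.outer(labels - 0.5, rng.normal(size=n_channels))
texform_resp = shared + rng.normal(scale=2.0, size=(n_trials, n_channels))
original_resp = shared + rng.normal(scale=2.0, size=(n_trials, n_channels))

# Train on texform-evoked responses, test on original-image responses
clf = LogisticRegression(max_iter=1000).fit(texform_resp, labels)
print("cross-decoding accuracy:", clf.score(original_resp, labels))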

