A neural decoding algorithm that generates language from visual activity evoked by natural images

2021
Author(s):  
Wei Huang ◽  
Hongmei Yan ◽  
Kaiwen Cheng ◽  
Chong Wang ◽  
Jiyi Li ◽  
...  
Author(s):  
Young Joon Kim ◽  
Nora Brackbill ◽  
Ella Batty ◽  
JinHyung Lee ◽  
Catalin Mitelut ◽  
...  

Abstract
Decoding sensory stimuli from neural activity can provide insight into how the nervous system might interpret the physical environment, and can facilitate the development of brain-machine interfaces. Nevertheless, the neural decoding problem remains a significant open challenge. Here, we present an efficient nonlinear decoding approach for inferring natural scene stimuli from the spiking activities of retinal ganglion cells (RGCs). Our approach uses neural networks to improve upon existing decoders in both accuracy and scalability. Trained and validated on real retinal spike data from more than 1000 simultaneously recorded macaque RGC units, the decoder demonstrates the necessity of nonlinear computations for accurate decoding of the fine structure of visual stimuli. Specifically, high-pass spatial features of natural images can only be decoded using nonlinear techniques, while low-pass features can be extracted equally well by linear and nonlinear methods. Together, these results advance the state of the art in decoding natural stimuli from large populations of neurons.

Author summary
Neural decoding is a fundamental problem in computational and statistical neuroscience. There is an enormous literature on this problem, applied to a wide variety of brain areas and nervous systems. Here we focus on the problem of decoding visual information from the retina. The bulk of previous work in this area has focused on simple linear decoders, applied to modest numbers of simultaneously recorded cells, to decode artificial stimuli. In contrast, here we develop a scalable nonlinear decoding method to decode natural images from the responses of over a thousand simultaneously recorded units, and show that this decoder significantly improves on the state of the art.
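The abstract's central claim is that some stimulus features are linearly decodable from population spiking while others require nonlinear computation. The toy simulation below (not the paper's model; the generative setup, variable names, and the gain-normalization step are all invented for illustration) shows one simple way this can happen: when responses carry a shared, trial-varying multiplicative gain, an ordinary least-squares linear decoder cannot remove it, while a single nonlinear preprocessing step restores decodability.

```python
import numpy as np

# Hypothetical generative model: each trial's population response encodes a
# stimulus feature h, multiplied by a shared gain that varies across trials.
rng = np.random.default_rng(0)
n_cells, n_trials = 200, 5000

h = rng.normal(size=n_trials)             # stimulus feature to decode
gain = np.exp(rng.normal(size=n_trials))  # shared multiplicative gain per trial
w = rng.normal(size=n_cells)              # per-cell tuning weights
responses = gain[:, None] * (2.0 + np.outer(h, w))
responses += 0.1 * rng.normal(size=responses.shape)  # measurement noise

def linear_r2(X, y):
    """R^2 of an ordinary least-squares linear decoder y_hat = [1, X] @ beta."""
    X1 = np.column_stack([np.ones(len(X)), X])  # intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    y_hat = X1 @ beta
    return 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

# Purely linear decoder: the trial-to-trial gain corrupts the estimate of h.
r2_linear = linear_r2(responses, h)

# One nonlinear step: divide each trial by its population mean, which tracks
# the shared gain, then decode linearly from the normalized responses.
normalized = responses / responses.mean(axis=1, keepdims=True)
r2_nonlinear = linear_r2(normalized, h)

print(f"linear decoder R^2:          {r2_linear:.2f}")
print(f"gain-normalized decoder R^2: {r2_nonlinear:.2f}")
```

In this toy setting the best purely linear read-out is bounded by the gain's variability, whereas the normalized decoder recovers the feature almost perfectly; the paper's actual decoder is a trained neural network, not this hand-crafted normalization, but the example conveys why nonlinearity can be essential for some features and unnecessary for others.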


2020
Vol 41 (15)
pp. 4442-4453
Author(s):  
Wei Huang ◽  
Hongmei Yan ◽  
Chong Wang ◽  
Jiyi Li ◽  
Xiaoqing Yang ◽  
...  

2021
pp. 1-32
Author(s):  
Young Joon Kim ◽  
Nora Brackbill ◽  
Eleanor Batty ◽  
JinHyung Lee ◽  
Catalin Mitelut ◽  
...  



Author(s):  
Yuki HAYAMI ◽  
Daiki TAKASU ◽  
Hisakazu AOYANAGI ◽  
Hiroaki TAKAMATSU ◽  
Yoshifumi SHIMODAIRA ◽  
...  
