Reconstructed Brain-Like Neural Network (R-KFDNN)

2021 ◽ 
Author(s):  
Zhu Rongrong

Abstract Drawing on the damage-and-repair process of the human brain's nervous system, we construct deep learning and training procedures that model the repair of a brain-like, high-dimensional flexible neural network after damage or local loss of data, so as to prevent the curse of dimensionality caused by the local loss of high-dimensional data. The central question is how to recover and extract feature information when the damaged neural system (the flexible neural network) suffers amnesia, that is, local loss of stored information. Information extraction generally proceeds through the distribution table of the generation sequence of key groups, in either higher or lower dimensions, to locate the core data stored in the brain. The generation sequence of a key group lies in a hidden time tangent cluster. Brain-like slice data processing operates across different levels, dimensions, tangent clusters and cotangent clusters. A key group in the brain can be regarded as a distribution table of memory fragments. Memory parsing exhibits mirror reflection and is accompanied by the loss of local random data. Within a compactly compressed time tangent cluster, processing switches freely to the high-dimensional information field, and the parsed key is embedded in the information.
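
The abstract leaves the R-KFDNN architecture unspecified; as a rough illustration of its core idea, recovering feature information after local loss of high-dimensional data, the sketch below trains a small masked autoencoder (a stand-in assumption, not the paper's model) to reconstruct vectors whose entries have been randomly zeroed out.

```python
# A minimal sketch (an assumption, not the paper's R-KFDNN architecture):
# a masked autoencoder learns to reconstruct high-dimensional vectors when a
# random subset of entries ("local loss") has been zeroed out.
import torch
import torch.nn as nn

class MaskedReconstructor(nn.Module):
    def __init__(self, dim=256, hidden=64):
        super().__init__()
        # encoder compresses the (partially lost) high-dimensional input
        self.encoder = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU())
        # decoder attempts to restore the full vector, including lost entries
        self.decoder = nn.Linear(hidden, dim)

    def forward(self, x):
        return self.decoder(self.encoder(x))

def train_step(model, opt, batch, loss_rate=0.2):
    # simulate "local loss": randomly drop a fraction of each stored vector
    mask = (torch.rand_like(batch) > loss_rate).float()
    recon = model(batch * mask)
    # the loss covers the whole vector, so the model must infer the missing
    # entries from the surviving ones
    loss = ((recon - batch) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

if __name__ == "__main__":
    torch.manual_seed(0)
    model = MaskedReconstructor()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    data = torch.randn(512, 256)  # stand-in for stored "memory fragments"
    for epoch in range(5):
        print(epoch, train_step(model, opt, data))
```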


2006 ◽  
Vol 16 (08) ◽  
pp. 2425-2434 ◽  
Author(s):  
Xu Li ◽  
Guang Li ◽  
Le Wang ◽  
Walter J. Freeman

This paper presents a simulation of a biological olfactory neural system with a KIII set, which is a high-dimensional chaotic neural network. The KIII set differs from conventional artificial neural networks in its use of chaotic attractors as memory locations that are accessed by chaotic trajectories. It was designed to simulate the patterns of action potentials and EEG waveforms observed in electrophysiological experiments, and has proved its utility as a model of biological intelligence for pattern classification. An application to the recognition of handwritten numerals is presented here, in which the classification performance of the KIII network under different noise levels was investigated.
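
The KIII set itself is a large system of coupled nonlinear oscillators and is not reproduced here; the sketch below only illustrates the shape of the reported experiment, measuring classification accuracy on handwritten digits as additive noise grows, with an ordinary multilayer perceptron standing in for the KIII network (an assumption made for brevity).

```python
# A minimal sketch of the noise-robustness experiment: train on clean
# handwritten digits, then measure accuracy as additive Gaussian noise of
# increasing strength corrupts the test images. The classifier is a simple
# MLP stand-in, not the KIII chaotic network described in the paper.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X = X / 16.0  # scale pixel intensities to [0, 1]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

rng = np.random.default_rng(0)
for sigma in (0.0, 0.1, 0.2, 0.4):
    # corrupt only the test set, mimicking recognition under noise
    noisy = X_test + rng.normal(0.0, sigma, X_test.shape)
    acc = clf.score(noisy, y_test)
    print(f"noise sigma={sigma:.1f}  accuracy={acc:.3f}")
```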


2020 ◽  
Vol 16 (11) ◽  
pp. e1008435 ◽  
Author(s):  
Corey Weistuch ◽  
Luca Agozzino ◽  
Lilianne R. Mujica-Parodi ◽  
Ken A. Dill

We give an approximate solution to the difficult inverse problem of inferring the topology of an unknown network from given time-dependent signals at the nodes. For example, we measure signals from individual neurons in the brain and infer how they are interconnected. We use Maximum Caliber as an inference principle. The combinatorial challenge of high-dimensional data is handled using two different approximations to the pairwise couplings. We show two proofs of principle: in a nonlinear genetic toggle switch circuit, and in a toy neural network.
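
The paper's two specific approximations to the pairwise couplings are not spelled out in the abstract; as a hedged illustration of the general inverse problem, the sketch below infers couplings from binarized node time series by fitting one logistic model per node (a kinetic-Ising-style maximum-likelihood fit, not necessarily the authors' method) and compares them with the ground-truth couplings used to simulate the data.

```python
# A hedged sketch (not the authors' approximations): infer pairwise couplings
# J[i, j] from binary (+/-1) node time series by fitting, for each node i, a
# logistic model P(s_i(t+1)=+1 | s(t)) whose weights play the role of couplings.
import numpy as np
from sklearn.linear_model import LogisticRegression

def simulate(J, T=5000, rng=None):
    # generate Glauber-style binary dynamics from known couplings (ground truth)
    rng = rng or np.random.default_rng(0)
    n = J.shape[0]
    s = np.ones(n)
    traj = np.empty((T, n))
    for t in range(T):
        p = 1.0 / (1.0 + np.exp(-2.0 * J @ s))   # P(s_i -> +1)
        s = np.where(rng.random(n) < p, 1.0, -1.0)
        traj[t] = s
    return traj

def infer_couplings(traj):
    # one logistic regression per node: predict its next state from all current states
    T, n = traj.shape
    X, Y = traj[:-1], traj[1:]
    J_hat = np.zeros((n, n))
    for i in range(n):
        model = LogisticRegression(C=1e6, max_iter=1000)  # weak regularization
        model.fit(X, (Y[:, i] > 0).astype(int))
        J_hat[i] = model.coef_[0] / 2.0   # undo the factor of 2 in the sigmoid
    return J_hat

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    J_true = 0.3 * rng.standard_normal((5, 5))
    traj = simulate(J_true, rng=rng)
    J_hat = infer_couplings(traj)
    print("correlation true vs inferred:",
          np.corrcoef(J_true.ravel(), J_hat.ravel())[0, 1])
```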


2020 ◽  
pp. 1-11
Author(s):  
Wenjuan Ma ◽  
Xuesi Zhao ◽  
Yuxiu Guo

The application of artificial intelligence and machine learning algorithms to education reform is an inevitable trend in the development of teaching. To improve teaching intelligence, this paper builds an auxiliary teaching system on top of the traditional teaching model using computer artificial intelligence and a neural network. An optimization strategy is applied to the TLBO (teaching-learning-based optimization) algorithm to reduce its running time, and an extracurricular learning mechanism is introduced to add adjustable parameters, which helps the algorithm escape local optima. In addition, the crowding factor from the fish school algorithm is used to quantify the degree of teachers' control over students: students within the crowded range gather near the teacher, while students who are difficult to constrain perform the following behavior and track the top students. Finally, this study builds a model based on actual needs and designs a control experiment to verify system performance. The results show that the proposed system performs well and can provide a theoretical reference for related research.
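
As a point of reference for the algorithmic part of this abstract, the sketch below implements the baseline TLBO loop, a teacher phase followed by a learner phase, on a toy objective; the paper's extensions (the runtime optimization, the extracurricular learning mechanism, and the fish-school crowding factor) are not reproduced here.

```python
# A minimal sketch of standard teaching-learning-based optimization (TLBO),
# minimizing a toy objective. The paper's extensions are not included.
import numpy as np

def tlbo(objective, dim=10, pop_size=30, iters=200, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.apply_along_axis(objective, 1, pop)

    for _ in range(iters):
        # teacher phase: everyone moves toward the best learner (the "teacher")
        teacher = pop[np.argmin(fit)]
        mean = pop.mean(axis=0)
        tf = rng.integers(1, 3, size=(pop_size, 1))        # teaching factor in {1, 2}
        cand = np.clip(pop + rng.random((pop_size, dim)) * (teacher - tf * mean), lo, hi)
        cand_fit = np.apply_along_axis(objective, 1, cand)
        better = cand_fit < fit
        pop[better], fit[better] = cand[better], cand_fit[better]

        # learner phase: each learner interacts with a random peer and moves
        # toward the better of the two
        partners = rng.permutation(pop_size)
        step = np.where((fit < fit[partners])[:, None],
                        pop - pop[partners], pop[partners] - pop)
        cand = np.clip(pop + rng.random((pop_size, dim)) * step, lo, hi)
        cand_fit = np.apply_along_axis(objective, 1, cand)
        better = cand_fit < fit
        pop[better], fit[better] = cand[better], cand_fit[better]

    best = np.argmin(fit)
    return pop[best], fit[best]

if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x * x))
    x_best, f_best = tlbo(sphere)
    print("best value:", f_best)
```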


2021 ◽  
Author(s):  
Takeshi Okanoue ◽  
Toshihide Shima ◽  
Yasuhide Mitsumoto ◽  
Atsushi Umemura ◽  
Kanji Yamaguchi ◽  
...  
