A Neural Architecture Generator for Efficient Search Space

2021
Author(s): Kun Jing, Jungang Xu, Zhen Zhang

2021, Vol 54 (4), pp. 1-34
Author(s): Pengzhen Ren, Yun Xiao, Xiaojun Chang, Po-yao Huang, Zhihui Li, ...

Deep learning has achieved substantial breakthroughs in many fields thanks to its powerful automatic representation capabilities, and neural architecture design has been shown to be crucial to how well data are represented and to the final performance. However, architecture design still relies heavily on researchers' prior knowledge and experience, and because of the limits of human knowledge it is difficult to break out of established design paradigms and reach an optimal model. An intuitive idea, therefore, is to reduce human intervention as much as possible and let an algorithm design the neural architecture automatically. Neural Architecture Search (NAS) is such an algorithm, and the related research is extensive, so a comprehensive and systematic survey of NAS is essential. Earlier surveys classify existing work mainly by the key components of NAS: search space, search strategy, and evaluation strategy. Although this classification is intuitive, it makes it difficult for readers to grasp the underlying challenges and the landmark work involved. In this survey, we therefore take a new perspective: we begin with an overview of the characteristics of the earliest NAS algorithms, summarize the problems in these early approaches, and then present the solutions proposed by subsequent work. In addition, we conduct a detailed and comprehensive analysis, comparison, and summary of these works. Finally, we outline possible future research directions.
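To illustrate how the three components named in this abstract fit together, the following minimal Python sketch runs a toy random search: a small layer-choice search space, random sampling as the search strategy, and a placeholder evaluation. All names, choices, and the scoring function are assumptions for illustration only; this is not an implementation of any method from the works listed here.

# Illustrative sketch only: a toy NAS loop built from the three components
# the survey names (search space, search strategy, evaluation strategy).
import random

# Search space: each candidate architecture is a list of layer choices.
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],
    "op": ["conv3x3", "conv5x5", "sep_conv", "skip"],
    "width": [16, 32, 64],
}

def sample_architecture(rng: random.Random) -> dict:
    """Search strategy (here: plain random search) draws one candidate."""
    depth = rng.choice(SEARCH_SPACE["num_layers"])
    return {
        "layers": [rng.choice(SEARCH_SPACE["op"]) for _ in range(depth)],
        "width": rng.choice(SEARCH_SPACE["width"]),
    }

def evaluate(arch: dict) -> float:
    """Evaluation strategy: a stand-in score; real NAS would train the
    candidate (fully or via a proxy) and return validation accuracy."""
    return random.random()  # placeholder score

def random_search(budget: int = 20, seed: int = 0) -> dict:
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(budget):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch

if __name__ == "__main__":
    print(random_search())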


2020, Vol 10 (11), pp. 3712
Author(s): Dongjing Shan, Xiongwei Zhang, Wenhua Shi, Li Li

In sequence learning with neural networks, a central problem is how to capture long-term dependencies and alleviate the vanishing-gradient phenomenon. To address this problem, we proposed a neural network with random connections obtained through a neural architecture search scheme. First, a dense network was designed and trained to construct a search space; another network was then generated by random sampling in that space, and its skip connections could transmit information directly across multiple time steps, capturing long-term dependencies more efficiently. Moreover, we devised a novel cell structure that required less memory and computational power than long short-term memory (LSTM) structures, and finally we applied a special initialization scheme to the cell parameters, which permitted unhindered gradient propagation along the time axis at the beginning of training. In the experiments, we evaluated four sequential tasks: adding, copying, frequency discrimination, and image classification; we also adopted several state-of-the-art methods for comparison. The experimental results demonstrated that our proposed model achieved the best performance.
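As a rough illustration of the sampling idea described in this abstract (a dense set of candidate skip connections serving as a search space, from which a sparse sub-network is drawn at random), the Python sketch below enumerates hypothetical skip connections over an unrolled sequence and samples a subset. The edge representation, span limit, and keep probability are assumptions; the paper's cell structure, initialization scheme, and training procedure are not reproduced here.

# A minimal sketch, assuming the pipeline in the abstract: build a dense graph
# of candidate skip connections as the search space, then sample a sparse
# sub-network at random. Names and parameters are hypothetical.
import random

def dense_search_space(num_steps: int, max_span: int):
    """All candidate skip connections (i -> j) with 1 < j - i <= max_span,
    in addition to the ordinary recurrent edges (i -> i+1)."""
    return [(i, j)
            for i in range(num_steps)
            for j in range(i + 2, min(i + max_span, num_steps - 1) + 1)]

def sample_subnetwork(space, keep_prob: float, rng: random.Random):
    """Random sampling step: keep each candidate skip connection with
    probability keep_prob, yielding one sparse architecture."""
    return [edge for edge in space if rng.random() < keep_prob]

if __name__ == "__main__":
    rng = random.Random(42)
    space = dense_search_space(num_steps=12, max_span=6)
    skips = sample_subnetwork(space, keep_prob=0.2, rng=rng)
    print(f"{len(space)} candidate skip connections, sampled {len(skips)}:")
    print(skips)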


2000, Vol 30 (1), pp. 37-53
Author(s): Kris Demuynck, Jacques Duchateau, Dirk Van Compernolle, Patrick Wambacq

IEEE Access, 2021, pp. 1-1
Author(s): Keith G. Mills, Mohammad Salameh, Di Niu, Fred X. Han, Seyed Saeed Changiz Rezaei, ...

Author(s): Xiaoyang Gao, Sriram Krishnamoorthy, Swarup Kumar Sahoo, Chi-Chung Lam, Gerald Baumgartner, ...
