On Architecture Selection for Linear Inverse Problems with Untrained Neural Networks

Entropy ◽  
2021 ◽  
Vol 23 (11) ◽  
pp. 1481
Author(s):  
Yang Sun ◽  
Hangdong Zhao ◽  
Jonathan Scarlett

In recent years, neural network based image priors have been shown to be highly effective for linear inverse problems, often significantly outperforming conventional methods that are based on sparsity and related notions. While pre-trained generative models are perhaps the most common, it has additionally been shown that even untrained neural networks can serve as excellent priors in various imaging applications. In this paper, we seek to broaden the applicability and understanding of untrained neural network priors by investigating the interaction between architecture selection, measurement models (e.g., inpainting vs. denoising vs. compressive sensing), and signal types (e.g., smooth vs. erratic). We motivate the problem via statistical learning theory, and provide two practical algorithms for tuning architectural hyperparameters. Using experimental evaluations, we demonstrate that the optimal hyperparameters may vary significantly between tasks and can exhibit large performance gaps when tuned for the wrong task. In addition, we investigate which hyperparameters tend to be more important, and which are robust to deviations from the optimum.
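The task-dependence of architectural hyperparameters can be illustrated with a toy search. The sketch below is not one of the paper's two algorithms; it uses a hypothetical stand-in for the validation reconstruction error (in practice this would come from fitting an untrained network on the task at hand) and an exhaustive grid search over depth and channel width, assuming each task has its own optimum:

```python
from itertools import product

# Hypothetical stand-in for the true objective: in the paper's setting this
# would be the validation reconstruction error of an untrained network with
# the given architecture on a specific task (denoising, inpainting, ...).
TASK_OPTIMA = {"denoising": (5, 128), "inpainting": (3, 64)}

def reconstruction_error(task, depth, channels):
    d_opt, c_opt = TASK_OPTIMA[task]
    return (depth - d_opt) ** 2 + ((channels - c_opt) / 32) ** 2

def grid_search(task, depths=range(1, 9), widths=(16, 32, 64, 128, 256)):
    """Exhaustively evaluate every (depth, channels) pair and keep the best."""
    return min(product(depths, widths),
               key=lambda dc: reconstruction_error(task, *dc))

best_denoise = grid_search("denoising")   # architecture tuned for denoising
best_inpaint = grid_search("inpainting")  # a different optimum for inpainting
```

Evaluating the inpainting-tuned architecture on the denoising objective exhibits exactly the cross-task performance gap the abstract describes.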

Author(s):  
M Venkata Krishna Reddy* ◽  
Pradeep S.



Author(s):  
Andrew Lishchytovych ◽  
Volodymyr Pavlenko

The object of this study is to analyse the effectiveness of document ranking algorithms in search engines that use artificial neural networks to match texts. The purpose of the study was to inspect a neural network model of text document ranking that uses clustering, factor analysis, and a multi-layered network architecture. The performance of the neural network algorithms was compared with the standard statistical search algorithm Okapi BM25. The result of the study is an evaluation of the effectiveness of the particular models, with recommendations on model selection for specific datasets.
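The Okapi BM25 baseline mentioned above is a purely statistical term-weighting scheme. A minimal self-contained implementation on a toy tokenized corpus (the documents and query here are illustrative, not the study's dataset):

```python
import math
from collections import Counter

def bm25_scores(query_terms, docs, k1=1.5, b=0.75):
    """Okapi BM25 score of each tokenized document in `docs` for the query."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    # Document frequency of each query term across the corpus.
    df = {t: sum(1 for d in docs if t in d) for t in query_terms}
    scores = []
    for doc in docs:
        tf = Counter(doc)
        score = 0.0
        for t in query_terms:
            idf = math.log((N - df[t] + 0.5) / (df[t] + 0.5) + 1.0)
            # Term-frequency saturation (k1) and length normalization (b).
            norm = tf[t] + k1 * (1 - b + b * len(doc) / avgdl)
            score += idf * tf[t] * (k1 + 1) / norm
        scores.append(score)
    return scores

docs = [
    "neural networks rank text documents".split(),
    "statistical ranking of text documents".split(),
    "weather forecast for tomorrow".split(),
]
scores = bm25_scores("ranking documents".split(), docs)
```

The second document matches both query terms and ranks first; the third shares no terms with the query and scores zero.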


2012 ◽  
Vol 485 ◽  
pp. 249-252
Author(s):  
Wei Ya Shi

In this paper, we give a short survey of natural gas load forecasting technology. Because gas load variation is influenced by weather and human activity, traditional forecasting methods achieve only low precision. Forecasting methods based on statistical learning theory and artificial neural networks are discussed in detail. We also outline directions for future research on gas load forecasting methods.
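The weather and demand-persistence dependence the survey describes can be captured even by a simple statistical baseline. The sketch below is illustrative only (synthetic data, not any method from the survey): a linear model of today's load on yesterday's load and today's temperature, fitted by ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(1)
days = 200

# Synthetic daily series: colder weather raises heating load, and demand
# shows persistence from one day to the next.
temp = 10 + 8 * np.sin(np.arange(days) / 30) + rng.normal(0, 1, days)
load = np.empty(days)
load[0] = 100.0
for t in range(1, days):
    load[t] = 60 + 0.5 * load[t - 1] - 2.0 * temp[t] + rng.normal(0, 1)

# Baseline: linear model load[t] ~ 1 + load[t-1] + temp[t], via least squares.
X = np.column_stack([np.ones(days - 1), load[:-1], temp[1:]])
y = load[1:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
rmse = float(np.sqrt(np.mean((X @ coef - y) ** 2)))
```

The fitted temperature coefficient is negative (colder days, higher load), which is the kind of structure a neural network forecaster would also need to learn.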


RSC Advances ◽  
2016 ◽  
Vol 6 (102) ◽  
pp. 99676-99684 ◽  
Author(s):  
Davor Antanasijević ◽  
Jelena Antanasijević ◽  
Viktor Pocajt ◽  
Gordana Ušćumlić

The QSPR study on the transition temperatures of five-ring bent-core liquid crystals (LCs) was performed using GMDH-type neural networks. A novel multi-filter approach, which combines chi-squared ranking, v-WSH, and the GMDH algorithm, was used for the selection of descriptors.
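As an illustration of the chi-squared ranking step of such a filter (v-WSH and GMDH are more involved), here is a minimal sketch on hypothetical binarized descriptors against a binary property class; the data and descriptor names are invented for the example:

```python
def chi2_stat(x, y):
    """Chi-squared statistic of the 2x2 contingency table built from two
    binary sequences (a binarized descriptor vs. a binary property class)."""
    n = len(x)
    obs = [[0, 0], [0, 0]]
    for xi, yi in zip(x, y):
        obs[xi][yi] += 1
    chi2 = 0.0
    for i in (0, 1):
        for j in (0, 1):
            row = obs[i][0] + obs[i][1]
            col = obs[0][j] + obs[1][j]
            expected = row * col / n
            if expected > 0:
                chi2 += (obs[i][j] - expected) ** 2 / expected
    return chi2

# Hypothetical data: cls is the property class; d0 tracks it, d1 is noise.
cls = [1, 1, 1, 1, 0, 0, 0, 0]
descriptors = {
    "d0": [1, 1, 1, 0, 0, 0, 0, 1],
    "d1": [1, 0, 1, 0, 1, 0, 1, 0],
}
scores = {name: chi2_stat(vals, cls) for name, vals in descriptors.items()}
ranked = sorted(scores, key=scores.get, reverse=True)
```

Descriptors with the highest statistic are kept and passed on to the next filter stage.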


Author(s):  
Takehiko Ogawa

Network inversion solves inverse problems, estimating causes from results using a multilayer neural network. The original network inversion method was applied to the usual multilayer neural networks with real-valued inputs and outputs. For general inverse problems involving complex numbers, a solution by a neural network with complex-valued inputs and outputs is necessary. In this chapter, we introduce the complex-valued network inversion method to solve inverse problems with complex numbers. In general, difficulties attributable to the ill-posedness of inverse problems appear; regularization addresses this ill-posedness by adding conditions to the solution. In this chapter, we also explain regularization for the complex-valued network inversion.
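A minimal sketch of the idea, under a strong simplifying assumption: a fixed complex linear map stands in for the trained forward network (so the "forward pass" is just y = Ax), and gradient descent on the input with a Tikhonov penalty regularizes the underdetermined, ill-posed inversion:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for a trained forward network: a fixed complex linear map A with
# more unknowns than measurements (5 inputs, 3 outputs -> ill-posed inversion).
A = rng.normal(size=(3, 5)) + 1j * rng.normal(size=(3, 5))
x_true = rng.normal(size=5) + 1j * rng.normal(size=5)
y = A @ x_true  # observed "result"; inversion estimates the "cause" x

lam = 1e-3                                    # Tikhonov weight on ||x||^2
lr = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam)  # safe gradient step size
x = np.zeros(5, dtype=complex)
for _ in range(5000):
    r = A @ x - y
    # Gradient of ||Ax - y||^2 + lam*||x||^2 w.r.t. conj(x) (Wirtinger calculus)
    x = x - lr * (A.conj().T @ r + lam * x)

residual = float(np.linalg.norm(A @ x - y))
```

With a real trained network, the same loop would backpropagate through the network to the input instead of multiplying by the conjugate transpose; the regularization term plays the same role in both cases.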


Crystals ◽  
2021 ◽  
Vol 11 (3) ◽  
pp. 258
Author(s):  
Patrick Trampert ◽  
Dmitri Rubinstein ◽  
Faysal Boughorbel ◽  
Christian Schlinkmann ◽  
Maria Luschkova ◽  
...  

The analysis of microscopy images has always been an important yet time-consuming process in materials science. Convolutional Neural Networks (CNNs) have been used very successfully for a number of tasks, such as image segmentation. However, training a CNN requires a large amount of hand-annotated data, which can be a problem for materials science data. We present a procedure to generate synthetic data based on ad hoc parametric data modelling to enhance the generalization of trained neural network models. Such an approach is especially beneficial when little data can be gathered, and may enable reasonable training of a neural network. Furthermore, we show that targeted data generation, by adaptively sampling the parameter space of the generative models, gives superior results compared to generating random data points.
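The targeted-generation idea can be sketched in a few lines. Everything below is hypothetical: a dummy parametric generator and an assumed per-parameter error estimate stand in for the synthetic micrograph model and the trained network's validation error.

```python
import random

# Hypothetical parametric generator: theta might control, e.g., grain size
# in a synthetic micrograph; here it just returns a theta-tagged sample.
def generate(theta):
    return {"theta": theta}

# Hypothetical per-parameter validation error of the current model:
# assume the model struggles most for small theta (theta near 0).
def estimated_error(theta):
    return (1.0 - theta) ** 2

def targeted_sample(n, n_candidates=20, seed=0):
    """Draw uniform candidate parameters, keep the n with the largest
    estimated model error, and generate synthetic data only there."""
    rng = random.Random(seed)
    candidates = [rng.random() for _ in range(n * n_candidates)]
    chosen = sorted(candidates, key=estimated_error, reverse=True)[:n]
    return [generate(t) for t in chosen]

batch = targeted_sample(10)
```

Compared with uniform random generation, the batch concentrates in the region of parameter space where the model is estimated to perform worst, which is the intuition behind the adaptive sampling described in the abstract.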

